Patent abstract:
Various systems and methods for evaluating a surgical team are disclosed. A computer system, such as a central surgical controller, can be configured to be communicably coupled to a surgical device and a camera. The computer system can be programmed to determine contextual information regarding a surgical procedure based, at least in part, on perioperative data received from the surgical device during the surgical procedure. In addition, the computer system can visually determine a physical characteristic of a member of the surgical team through the camera and compare the physical characteristic to a baseline to assess that member of the surgical team.
Publication number: BR112020013047A2
Application number: R112020013047-8
Filing date: 2018-11-14
Publication date: 2020-12-01
Inventors: Frederick E. Shelton IV; Jason L. Harris; Taylor W. Aronhalt
Applicant: Ethicon LLC
IPC main classification:
Patent description:

[001] This application claims the benefit of US non-provisional Patent Application Serial No. 16/182,255, entitled USAGE AND TECHNIQUE ANALYSIS OF SURGEON/STAFF PERFORMANCE AGAINST A BASELINE.
[002] The present application claims priority under 35 U.S.C. § 119(e) to US Provisional Patent Application No. 62/729,191, entitled SURGICAL NETWORK RECOMMENDATIONS FROM REAL-TIME ANALYSIS OF PROCEDURE VARIABLES AGAINST A BASELINE HIGHLIGHTING DIFFERENCES FROM THE OPTIMAL SOLUTION, filed on September 10, 2018, the disclosure of which is hereby incorporated by reference.
[003] The present application claims priority under 35 U.S.C. § 119(e) to US Provisional Patent Application No. 62/692,747, entitled SMART ACTIVATION OF AN ENERGY DEVICE BY ANOTHER DEVICE, filed on June 30, 2018, to US Provisional Patent Application No. 62/692,748, entitled SMART ENERGY ARCHITECTURE, filed on June 30, 2018, and to US Provisional Patent Application No. 62/692,768, entitled SMART ENERGY DEVICES, filed on June 30, 2018, the disclosure of each of which is incorporated herein by reference in its entirety.
[004] This application also claims priority under 35 U.S.C. § 119(e) to US Provisional Patent Application No. 62/659,900, entitled METHOD OF HUB COMMUNICATION, filed on April 19, 2018, the disclosure of which is hereby incorporated by reference in its entirety.
[005] The present application also claims priority under 35 U.S.C. § 119(e) to US Provisional Patent Application No. 62/650,898, filed on March 30, 2018, entitled CAPACITIVE COUPLED RETURN PATH PAD WITH SEPARABLE ARRAY ELEMENTS, to US Provisional Patent Application Serial No. 62/650,887, entitled SURGICAL SYSTEMS WITH OPTIMIZED SENSING CAPABILITIES, filed on March 30, 2018, to US Provisional Patent Application Serial No. 62/650,882, entitled SMOKE EVACUATION MODULE FOR INTERACTIVE SURGICAL PLATFORM, filed on March 30, 2018, and to US Provisional Patent Application Serial No. 62/650,877, entitled SURGICAL SMOKE EVACUATION SENSING AND CONTROLS, filed on March 30, 2018, the disclosure of each of which is hereby incorporated by reference in its entirety.
[006] The present application also claims priority under 35 U.S.C. § 119(e) to US Provisional Patent Application Serial No. 62/640,417, entitled TEMPERATURE CONTROL IN ULTRASONIC DEVICE AND CONTROL SYSTEM THEREFOR, filed on March 8, 2018, and to US Provisional Patent Application Serial No. 62/640,415, entitled ESTIMATING STATE OF ULTRASONIC END EFFECTOR AND CONTROL SYSTEM THEREFOR, filed on March 8, 2018, the disclosure of each of which is hereby incorporated by reference in its entirety.
[007] The present application also claims priority under 35 U.S.C. § 119(e) to US Provisional Patent Application Serial No.
[008] The present disclosure relates to various surgical systems. Surgical procedures are typically performed in surgical operating theaters or rooms in a health care facility, such as a hospital. A sterile field is typically created around the patient. The sterile field may include the scrubbed team members, who are properly attired, and all furniture and fixtures in the area. Various surgical devices and systems are used to perform a surgical procedure.
SUMMARY
[009] In one general aspect, a computer system is configured to be communicably coupled to a surgical device and a camera. The computer system comprises a processor and a memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the computer system to: receive perioperative data from the surgical device; determine a surgical context based, at least in part, on the perioperative data; receive an image of an individual via the camera; determine a physical characteristic of the individual from the image; retrieve a baseline physical characteristic corresponding to the surgical context; and determine whether the individual's physical characteristic deviates from the baseline physical characteristic.
[0010] In another general aspect, a computer-implemented method tracks a physical characteristic of an individual. The method comprises: receiving, by a computer system, perioperative data from a surgical device; determining, by the computer system, a surgical context based, at least in part, on the perioperative data; receiving, by the computer system, an image of the individual via a camera communicably coupled to the computer system; determining, by the computer system, a physical characteristic of the individual from the image; retrieving, by the computer system, a baseline physical characteristic corresponding to the surgical context; and determining, by the computer system, whether the individual's physical characteristic deviates from the baseline physical characteristic.
[0011] In yet another general aspect, a computer system is configured to be communicably coupled to a surgical device and a camera. The computer system comprises a processor and a memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the computer system to: receive perioperative data from the surgical device; determine a surgical context based, at least in part, on the perioperative data; receive an image of an individual via the camera; determine a physical characteristic of the individual from the image; and transmit identification data of the physical characteristic and the surgical context to a remote computer system; wherein the remote computer system determines a baseline physical characteristic corresponding to the surgical context and the physical characteristic according to data aggregated from a plurality of comparison systems.
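The control flow recited in the aspects above can be illustrated with a minimal sketch: infer the surgical context from perioperative data, measure a physical characteristic from an image, retrieve the baseline for that context, and flag deviation. This is not the patented implementation; the device-to-context rule, the posture-angle characteristic, the baseline table, and the tolerance value are all hypothetical placeholders.

```python
from dataclasses import dataclass

@dataclass
class Assessment:
    context: str
    characteristic: float
    baseline: float
    deviates: bool

# Hypothetical baseline table: expected posture angle (degrees) per surgical step.
BASELINES = {"dissection": 12.0, "stapling": 18.0}

def infer_context(perioperative_data: dict) -> str:
    """Toy rule: the active device implies the current procedure step."""
    return "stapling" if perioperative_data.get("device") == "stapler" else "dissection"

def measure_characteristic(image: dict) -> float:
    """Stand-in for a vision model that would estimate a posture angle from a camera frame."""
    return float(image["posture_angle"])

def assess(perioperative_data: dict, image: dict, tolerance: float = 5.0) -> Assessment:
    """Compare the measured characteristic against the context-specific baseline."""
    context = infer_context(perioperative_data)
    characteristic = measure_characteristic(image)
    baseline = BASELINES[context]
    deviates = abs(characteristic - baseline) > tolerance
    return Assessment(context, characteristic, baseline, deviates)
```

For example, `assess({"device": "stapler"}, {"posture_angle": 30.0})` infers the "stapling" context and flags the 30° posture as deviating from the 18° baseline.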
[0012] The various aspects described herein, both as to organization and methods of operation, together with further objects and advantages thereof, may best be understood by reference to the following description, taken in conjunction with the accompanying drawings.
[0013] Figure 1 is a block diagram of a computer-implemented interactive surgical system, according to at least one aspect of the present disclosure.
[0014] Figure 2 is a surgical system being used to perform a surgical procedure in an operating room, according to at least one aspect of the present disclosure.
[0015] Figure 3 is a central surgical controller paired with a visualization system, a robotic system, and an intelligent instrument, according to at least one aspect of the present disclosure.
[0016] Figure 4 is a partial perspective view of a casing of the central surgical controller, and of a combination generator module slidably received in the casing of the central surgical controller, according to at least one aspect of the present disclosure.
[0017] Figure 5 is a perspective view of a combination generator module with bipolar, ultrasonic, and monopolar contacts and a smoke evacuation component, according to at least one aspect of the present disclosure.
[0018] Figure 6 illustrates different power bus connectors for a plurality of side coupling ports of a side modular cabinet configured to receive a plurality of modules, according to at least one aspect of the present disclosure.
[0019] Figure 7 illustrates a vertical modular housing configured to receive a plurality of modules, according to at least one aspect of the present disclosure.
[0020] Figure 8 illustrates a surgical data network comprising a modular communication center configured to connect modular devices, located in one or more operating rooms of a health care facility or in any environment specially equipped for surgical operations, to the cloud, according to at least one aspect of the present disclosure.
[0021] Figure 9 illustrates a computer-implemented interactive surgical system, according to at least one aspect of the present disclosure.
[0022] Figure 10 illustrates a central surgical controller comprising a plurality of modules coupled to a modular control tower, according to at least one aspect of the present disclosure.
[0023] Figure 11 illustrates one aspect of a universal serial bus (USB) central network controller device, according to at least one aspect of the present disclosure.
[0024] Figure 12 is a block diagram of a cloud computing system comprising a plurality of intelligent surgical instruments coupled to central surgical controllers that can connect to the cloud component of the cloud computing system, according to at least one aspect of the present disclosure.
[0025] Figure 13 is a functional module architecture of a cloud computing system, according to at least one aspect of the present disclosure.
[0026] Figure 14 illustrates a diagram of a situationally aware surgical system, according to at least one aspect of the present disclosure.
[0027] Figure 15 is a timeline representing the situational awareness of a central surgical controller, according to at least one aspect of the present disclosure.
[0028] Figure 16 is a diagram of an illustrative operating room (OR) configuration, according to at least one aspect of the present disclosure.
[0029] Figure 17 is a logic flow diagram of a process for visually evaluating members of the surgical team, according to at least one aspect of the present disclosure.
[0030] Figure 18 is a diagram illustrating a series of models of a member of the surgical team over the course of a surgical procedure, according to at least one aspect of the present disclosure.
[0031] Figure 19 is a graph representing the measured posture of the surgical team member illustrated in Figure 18 over time, according to at least one aspect of the present disclosure.
[0032] Figure 20 is a representation of a surgeon holding a surgical instrument, according to at least one aspect of the present disclosure.
[0033] Figure 21 is a scatter plot of wrist angle versus surgical procedure outcomes, according to at least one aspect of the present disclosure.
DESCRIPTION
[0034] The applicant of the present application owns the following US Patent Applications, filed on November 6, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:
[0035] US Patent Application No. 16/182,224, entitled SURGICAL NETWORK, INSTRUMENT, AND CLOUD RESPONSES BASED ON VALIDATION OF RECEIVED DATASET AND AUTHENTICATION OF ITS SOURCE AND INTEGRITY;
[0036] US Patent Application No. 16/182,230, entitled SURGICAL SYSTEM FOR PRESENTING INFORMATION INTERPRETED FROM EXTERNAL DATA;
[0037] US Patent Application No. 16/182,233, entitled MODIFICATION OF SURGICAL SYSTEMS CONTROL PROGRAMS BASED ON MACHINE LEARNING;
[0038] US Patent Application No. 16/182,239, entitled ADJUSTMENT OF DEVICE CONTROL PROGRAMS BASED ON STRATIFIED CONTEXTUAL DATA IN ADDITION TO THE DATA;
[0039] US Patent Application No. 16/182,243, entitled SURGICAL HUB AND MODULAR DEVICE RESPONSE ADJUSTMENT BASED ON SITUATIONAL AWARENESS;
[0040] US Patent Application No. 16/182,248, entitled DETECTION AND ESCALATION OF SECURITY RESPONSES OF SURGICAL INSTRUMENTS TO INCREASING SEVERITY THREATS;
[0041] US Patent Application No. 16/182,251, entitled INTERACTIVE SURGICAL SYSTEM;
[0042] US Patent Application No. 16/182,260, entitled AUTOMATED DATA SCALING, ALIGNMENT, AND ORGANIZING BASED ON PREDEFINED PARAMETERS WITHIN SURGICAL NETWORKS;
[0043] US Patent Application No. 16/182,267, entitled SENSING THE PATIENT POSITION AND CONTACT UTILIZING THE MONOPOLAR RETURN PAD ELECTRODE TO PROVIDE SITUATIONAL AWARENESS TO A SURGICAL NETWORK;
[0044] US Patent Application No. 16/182,249, entitled POWERED SURGICAL TOOL WITH PREDEFINED ADJUSTABLE CONTROL ALGORITHM FOR CONTROLLING END EFFECTOR PARAMETER;
[0045] US Patent Application No. 16/182,246, entitled ADJUSTMENTS BASED ON AIRBORNE PARTICLE PROPERTIES;
[0046] US Patent Application No. 16/182,256, entitled ADJUSTMENT OF A SURGICAL DEVICE FUNCTION BASED ON SITUATIONAL AWARENESS;
[0047] US Patent Application No. 16/182,242, entitled REAL-TIME ANALYSIS OF COMPREHENSIVE COST OF ALL INSTRUMENTATION USED IN SURGERY UTILIZING DATA FLUIDITY TO TRACK INSTRUMENTS THROUGH STOCKING AND IN-HOUSE PROCESSES;
[0048] US Patent Application No. 16/182,269, entitled IMAGE CAPTURING OF THE AREAS OUTSIDE THE ABDOMEN TO IMPROVE PLACEMENT AND CONTROL OF A SURGICAL DEVICE IN USE;
[0049] US Patent Application No. 16/182,278, entitled COMMUNICATION OF DATA WHERE A SURGICAL NETWORK IS USING CONTEXT OF THE DATA AND REQUIREMENTS OF A RECEIVING SYSTEM/USER TO INFLUENCE INCLUSION OR LINKAGE OF DATA AND METADATA TO ESTABLISH CONTINUITY;
[0050] US Patent Application No. 16/182,290, entitled SURGICAL NETWORK RECOMMENDATIONS FROM REAL TIME ANALYSIS OF PROCEDURE VARIABLES AGAINST A BASELINE HIGHLIGHTING DIFFERENCES FROM THE OPTIMAL SOLUTION;
[0051] US Patent Application No. 16/182,232, entitled CONTROL OF A SURGICAL SYSTEM THROUGH A SURGICAL BARRIER;
[0052] US Patent Application No. 16/182,227, entitled SURGICAL NETWORK DETERMINATION OF PRIORITIZATION OF COMMUNICATION, INTERACTION, OR PROCESSING BASED ON SYSTEM OR DEVICE NEEDS;
[0053] US Patent Application No. 16/182,231, entitled WIRELESS PAIRING OF A SURGICAL DEVICE WITH ANOTHER DEVICE WITHIN A STERILE SURGICAL FIELD BASED ON THE USAGE AND SITUATIONAL AWARENESS OF DEVICES;
[0054] US Patent Application No. 16/182,229, entitled ADJUSTMENT OF STAPLE HEIGHT OF AT LEAST ONE ROW OF STAPLES BASED ON THE SENSED TISSUE THICKNESS OR FORCE IN CLOSING;
[0055] US Patent Application No. 16/182,234, entitled STAPLING DEVICE WITH BOTH COMPULSORY AND DISCRETIONARY LOCKOUTS BASED ON SENSED PARAMETERS;
[0056] US Patent Application No. 16/182,240, entitled POWERED STAPLING DEVICE CONFIGURED TO ADJUST FORCE, ADVANCEMENT SPEED, AND OVERALL STROKE OF CUTTING MEMBER BASED ON SENSED PARAMETER OF FIRING OR CLAMPING;
[0057] US Patent Application No. 16/182,235, entitled VARIATION OF RADIO FREQUENCY AND ULTRASONIC POWER LEVEL IN CO-
[0058] US Patent Application No. 16/182,238, entitled ULTRASONIC ENERGY DEVICE WHICH VARIES PRESSURE APPLIED BY CLAMP ARM TO PROVIDE THRESHOLD CONTROL PRESSURE AT A CUT PROGRESSION LOCATION.
[0059] The applicant of the present application owns the following US Patent Applications, filed on September 10, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:
[0060] US Provisional Patent Application No. 62/729,183, entitled A CONTROL FOR A SURGICAL NETWORK OR SURGICAL NETWORK CONNECTED DEVICE THAT ADJUSTS ITS FUNCTION BASED ON A SENSED SITUATION OR USAGE;
[0061] US Provisional Patent Application No. 62/729,177, entitled AUTOMATED DATA SCALING, ALIGNMENT, AND ORGANIZING BASED ON PREDEFINED PARAMETERS WITHIN A SURGICAL NETWORK BEFORE TRANSMISSION;
[0062] US Provisional Patent Application No. 62/729,176, entitled INDIRECT COMMAND AND CONTROL OF A FIRST OPERATING ROOM SYSTEM THROUGH THE USE OF A SECOND OPERATING ROOM SYSTEM WITHIN A STERILE FIELD WHERE THE SECOND OPERATING ROOM SYSTEM HAS PRIMARY AND SECONDARY OPERATING MODES;
[0063] US Provisional Patent Application No. 62/729,185, entitled POWERED STAPLING DEVICE THAT IS CAPABLE OF ADJUSTING FORCE, ADVANCEMENT SPEED, AND OVERALL STROKE OF CUTTING MEMBER OF THE DEVICE BASED ON SENSED PARAMETER OF FIRING OR CLAMPING;
[0064] US Provisional Patent Application No. 62/729,184, entitled POWERED SURGICAL TOOL WITH A PREDEFINED ADJUSTABLE CONTROL ALGORITHM FOR CONTROLLING AT LEAST ONE END EFFECTOR PARAMETER AND A MEANS FOR LIMITING THE ADJUSTMENT;
[0065] US Provisional Patent Application No. 62/729,182, entitled SENSING THE PATIENT POSITION AND CONTACT UTILIZING THE MONOPOLAR RETURN PAD ELECTRODE TO PROVIDE SITUATIONAL AWARENESS TO THE HUB;
[0066] US Provisional Patent Application No. 62/729,191, entitled SURGICAL NETWORK RECOMMENDATIONS FROM REAL TIME ANALYSIS OF PROCEDURE VARIABLES AGAINST A BASELINE HIGHLIGHTING DIFFERENCES FROM THE OPTIMAL SOLUTION;
[0067] US Provisional Patent Application No. 62/729,195, entitled ULTRASONIC ENERGY DEVICE WHICH VARIES PRESSURE APPLIED BY CLAMP ARM TO PROVIDE THRESHOLD CONTROL PRESSURE AT A CUT PROGRESSION LOCATION; and
[0068] US Provisional Patent Application No. 62/729,186, entitled WIRELESS PAIRING OF A SURGICAL DEVICE WITH ANOTHER DEVICE WITHIN A STERILE SURGICAL FIELD BASED ON THE USAGE AND SITUATIONAL AWARENESS OF DEVICES.
[0069] The applicant of the present application owns the following US Patent Applications, filed on August 28, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:
[0070] US Patent Application No. 16/115,214, entitled ESTIMATING STATE OF ULTRASONIC END EFFECTOR AND CONTROL SYSTEM THEREFOR;
[0071] US Patent Application No. 16/115,205, entitled TEMPERATURE CONTROL OF ULTRASONIC END EFFECTOR AND CONTROL SYSTEM THEREFOR;
[0072] US Patent Application No. 16/115,233, entitled RADIO FREQUENCY ENERGY DEVICE FOR DELIVERING COMBINED ELECTRICAL SIGNALS;
[0073] US Patent Application No. 16/115,208, entitled CONTROLLING AN ULTRASONIC SURGICAL INSTRUMENT ACCORDING TO TISSUE LOCATION;
[0074] US Patent Application No. 16/115,220, entitled CONTROLLING ACTIVATION OF AN ULTRASONIC SURGICAL INSTRUMENT ACCORDING TO THE PRESENCE OF TISSUE;
[0075] US Patent Application No. 16/115,232, entitled DETERMINING TISSUE COMPOSITION VIA AN ULTRASONIC SYSTEM;
[0076] US Patent Application No. 16/115,239, entitled DETERMINING THE STATE OF AN ULTRASONIC ELECTROMECHANICAL SYSTEM ACCORDING TO FREQUENCY SHIFT;
[0077] US Patent Application No. 16/115,247, entitled DETERMINING THE STATE OF AN ULTRASONIC END EFFECTOR;
[0078] US Patent Application No. 16/115,211, entitled SITUATIONAL AWARENESS OF ELECTROSURGICAL SYSTEMS;
[0079] US Patent Application No. 16/115,226, entitled MECHANISMS FOR CONTROLLING DIFFERENT ELECTROMECHANICAL SYSTEMS OF AN ELECTROSURGICAL INSTRUMENT;
[0080] US Patent Application No. 16/115,240, entitled DETECTION OF END EFFECTOR IMMERSION IN LIQUID;
[0081] US Patent Application No. 16/115,249, entitled INTERRUPTION OF ENERGY DUE TO INADVERTENT CAPACITIVE COUPLING;
[0082] US Patent Application No. 16/115,256, entitled INCREASING RADIO FREQUENCY TO CREATE PAD-LESS MONOPOLAR LOOP;
[0083] US Patent Application No. 16/115,223, entitled BIPOLAR COMBINATION DEVICE THAT AUTOMATICALLY ADJUSTS PRESSURE BASED ON ENERGY MODALITY; and
[0084] US Patent Application No. 16/115,238, entitled ACTIVATION OF ENERGY DEVICES.
[0085] The applicant of the present application owns the following US Patent Applications, filed on August 23, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:
[0086] US Provisional Patent Application No. 62/721,995, entitled CONTROLLING AN ULTRASONIC SURGICAL INSTRUMENT ACCORDING TO TISSUE LOCATION;
[0087] US Provisional Patent Application No. 62/721,998, entitled SITUATIONAL AWARENESS OF ELECTROSURGICAL SYSTEMS;
[0088] US Provisional Patent Application No. 62/721,999, entitled INTERRUPTION OF ENERGY DUE TO INADVERTENT CAPACITIVE COUPLING;
[0089] US Provisional Patent Application No. 62/721,994, entitled BIPOLAR COMBINATION DEVICE THAT AUTOMATICALLY ADJUSTS PRESSURE BASED ON ENERGY MODALITY; and
[0090] US Provisional Patent Application No. 62/721,996, entitled RADIO FREQUENCY ENERGY DEVICE FOR DELIVERING COMBINED ELECTRICAL SIGNALS.
[0091] The applicant of the present application owns the following US Patent Applications, filed on June 30, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:
[0092] US Provisional Patent Application No. 62/692,747, entitled SMART ACTIVATION OF AN ENERGY DEVICE BY ANOTHER DEVICE;
[0093] US Provisional Patent Application No. 62/692,748, entitled SMART ENERGY ARCHITECTURE; and
[0094] US Provisional Patent Application No. 62/692,768, entitled SMART ENERGY DEVICES.
[0095] The applicant of the present application owns the following US Patent Applications, filed on June 29, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:
[0096] US Patent Application Serial No. 16/024,090, entitled CAPACITIVE COUPLED RETURN PATH PAD WITH SEPARABLE ARRAY ELEMENTS;
[0097] US Patent Application Serial No. 16/024,057, entitled CONTROLLING A SURGICAL INSTRUMENT ACCORDING TO SENSED CLOSURE PARAMETERS;
[0098] US Patent Application Serial No. 16/024,067, entitled SYSTEMS FOR ADJUSTING END EFFECTOR PARAMETERS BASED ON PERIOPERATIVE INFORMATION;
[0099] US Patent Application Serial No. 16/024,075, entitled SAFETY SYSTEMS FOR SMART POWERED SURGICAL STAPLING;
[00100] US Patent Application Serial No. 16/024,083, entitled SAFETY SYSTEMS FOR SMART POWERED SURGICAL STAPLING;
[00101] US Patent Application Serial No. 16/024,094, entitled SURGICAL SYSTEMS FOR DETECTING END EFFECTOR TISSUE DISTRIBUTION IRREGULARITIES;
[00102] US Patent Application Serial No. 16/024,138, entitled SYSTEMS FOR DETECTING PROXIMITY OF SURGICAL END EFFECTOR TO CANCEROUS TISSUE;
[00103] US Patent Application Serial No. 16/024,150, entitled SURGICAL INSTRUMENT CARTRIDGE SENSOR ASSEMBLIES;
[00104] US Patent Application Serial No. 16/024,160, entitled VARIABLE OUTPUT CARTRIDGE SENSOR ASSEMBLY;
[00105] US Patent Application Serial No. 16/024,124, entitled SURGICAL INSTRUMENT HAVING A FLEXIBLE ELECTRODE;
[00106] US Patent Application Serial No. 16/024,132, entitled SURGICAL INSTRUMENT HAVING A FLEXIBLE CIRCUIT;
[00107] US Patent Application Serial No. 16/024,141, entitled SURGICAL INSTRUMENT WITH A TISSUE MARKING ASSEMBLY;
[00108] US Patent Application Serial No. 16/024,162, entitled SURGICAL SYSTEMS WITH PRIORITIZED DATA TRANSMISSION CAPABILITIES;
[00109] US Patent Application Serial No. 16/024,066, entitled SURGICAL EVACUATION SENSING AND MOTOR CONTROL;
[00110] US Patent Application Serial No. 16/024,096, entitled SURGICAL EVACUATION SENSOR ARRANGEMENTS;
[00111] US Patent Application Serial No. 16/024,116, entitled SURGICAL EVACUATION FLOW PATHS;
[00112] US Patent Application Serial No. 16/024,149, entitled SURGICAL EVACUATION SENSING AND GENERATOR CONTROL;
[00113] US Patent Application Serial No. 16/024,180, entitled SURGICAL EVACUATION SENSING AND DISPLAY;
[00114] US Patent Application Serial No. 16/024,245, entitled COMMUNICATION OF SMOKE EVACUATION SYSTEM PARAMETERS TO HUB OR CLOUD IN SMOKE EVACUATION MODULE FOR INTERACTIVE SURGICAL PLATFORM;
[00115] US Patent Application Serial No. 16/024,258, entitled SMOKE EVACUATION SYSTEM INCLUDING A SEGMENTED CONTROL CIRCUIT FOR INTERACTIVE SURGICAL PLATFORM;
[00116] US Patent Application Serial No. 16/024,265, entitled SURGICAL EVACUATION SYSTEM WITH A COMMUNICATION
[00117] US Patent Application Serial No. 16/024,273, entitled DUAL IN-SERIES LARGE AND SMALL DROPLET FILTERS.
[00118] The applicant of the present application owns the following provisional US Patent Applications, filed on June 28, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:
[00119] US Provisional Patent Application Serial No. 62/691,228, entitled A METHOD OF USING REINFORCED FLEX CIRCUITS WITH MULTIPLE SENSORS WITH ELECTROSURGICAL DEVICES;
[00120] US Provisional Patent Application Serial No. 62/691,227, entitled CONTROLLING A SURGICAL INSTRUMENT ACCORDING TO SENSED CLOSURE PARAMETERS;
[00121] US Provisional Patent Application Serial No. 62/691,230, entitled SURGICAL INSTRUMENT HAVING A FLEXIBLE ELECTRODE;
[00122] US Provisional Patent Application Serial No. 62/691,219, entitled SURGICAL EVACUATION SENSING AND MOTOR CONTROL;
[00123] US Provisional Patent Application Serial No. 62/691,257, entitled COMMUNICATION OF SMOKE EVACUATION SYSTEM PARAMETERS TO HUB OR CLOUD IN SMOKE EVACUATION MODULE FOR INTERACTIVE SURGICAL PLATFORM;
[00124] US Provisional Patent Application Serial No. 62/691,262, entitled SURGICAL EVACUATION SYSTEM WITH A COMMUNI-
[00125] US Provisional Patent Application Serial No. 62/691,251, entitled DUAL IN-SERIES LARGE AND SMALL DROPLET FILTERS.
[00126] The applicant of the present application owns the following provisional US Patent Application, filed on April 19, 2018, the disclosure of which is incorporated herein by reference in its entirety:
[00127] US Provisional Patent Application Serial No. 62/659,900, entitled METHOD OF HUB COMMUNICATION.
[00128] The applicant of the present application owns the following provisional US Patent Applications, filed on March 30, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:
[00129] US Provisional Patent Application Serial No. 62/650,898, filed on March 30, 2018, entitled CAPACITIVE COUPLED RETURN PATH PAD WITH SEPARABLE ARRAY ELEMENTS;
[00130] US Provisional Patent Application Serial No. 62/650,887, entitled SURGICAL SYSTEMS WITH OPTIMIZED SENSING CAPABILITIES;
[00131] US Provisional Patent Application Serial No. 62/650,882, entitled SMOKE EVACUATION MODULE FOR INTERACTIVE SURGICAL PLATFORM; and
[00132] US Provisional Patent Application Serial No. 62/650,877, entitled SURGICAL SMOKE EVACUATION SENSING AND CONTROLS.
[00133] The applicant of the present application owns the following US Patent Applications, filed on March 29, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:
[00134] US Patent Application Serial No. 15/940,641, entitled INTERACTIVE SURGICAL SYSTEMS WITH ENCRYPTED COMMUNICATION CAPABILITIES;
[00135] US Patent Application Serial No. 15/940,648, entitled INTERACTIVE SURGICAL SYSTEMS WITH CONDITION HANDLING OF DEVICES AND DATA CAPABILITIES;
[00136] US Patent Application Serial No. 15/940,656, entitled SURGICAL HUB COORDINATION OF CONTROL AND COMMUNICATION OF OPERATING ROOM DEVICES;
[00137] US Patent Application Serial No. 15/940,666, entitled SPATIAL AWARENESS OF SURGICAL HUBS IN OPERATING ROOMS;
[00138] US Patent Application Serial No. 15/940,670, entitled COOPERATIVE UTILIZATION OF DATA DERIVED FROM SECONDARY SOURCES BY INTELLIGENT SURGICAL HUBS;
[00139] US Patent Application Serial No. 15/940,677, entitled SURGICAL HUB CONTROL ARRANGEMENTS;
[00140] US Patent Application Serial No. 15/940,632, entitled DATA STRIPPING METHOD TO INTERROGATE PATIENT RECORDS AND CREATE ANONYMIZED RECORD;
[00141] US Patent Application Serial No. 15/940,640, entitled COMMUNICATION HUB AND STORAGE DEVICE FOR STORING PARAMETERS AND STATUS OF A SURGICAL DEVICE TO BE SHARED WITH CLOUD BASED ANALYTICS SYSTEMS;
[00142] US Patent Application Serial No. 15/940,645, entitled SELF DESCRIBING DATA PACKETS GENERATED AT AN ISSUING INSTRUMENT;
[00143] US Patent Application Serial No. 15/940,649, entitled DATA PAIRING TO INTERCONNECT A DEVICE MEASURED PARAMETER WITH AN OUTCOME;
[00144] US Patent Application Serial No. 15/940,654, entitled SURGICAL HUB SITUATIONAL AWARENESS;
[00145] US Patent Application Serial No. 15/940,663, entitled SURGICAL SYSTEM DISTRIBUTED PROCESSING;
[00146] US Patent Application Serial No. 15/940,668, entitled AGGREGATION AND REPORTING OF SURGICAL HUB DATA;
[00147] US Patent Application Serial No. 15/940,671, entitled SURGICAL HUB SPATIAL AWARENESS TO DETERMINE DEVICES IN OPERATING THEATER;
[00148] US Patent Application Serial No. 15/940,686, entitled DISPLAY OF ALIGNMENT OF STAPLE CARTRIDGE TO PRIOR LINEAR STAPLE LINE;
[00149] US Patent Application Serial No. 15/940,700, entitled STERILE FIELD INTERACTIVE CONTROL DISPLAYS;
[00150] US Patent Application Serial No. 15/940,629, entitled COMPUTER IMPLEMENTED INTERACTIVE SURGICAL SYSTEMS;
[00151] US Patent Application Serial No. 15/940,704, entitled USE OF LASER LIGHT AND RED-GREEN-BLUE COLORATION TO DETERMINE PROPERTIES OF BACK SCATTERED LIGHT;
[00152] US Patent Application Serial No. 15/940,722, entitled CHARACTERIZATION OF TISSUE IRREGULARITIES THROUGH THE USE OF MONOCHROMATIC LIGHT REFRACTIVITY;
[00153] US Patent Application Serial No. 15/940,742, entitled DUAL CMOS ARRAY IMAGING;
[00154] US Patent Application Serial No. 15/940,636, entitled ADAPTIVE CONTROL PROGRAM UPDATES FOR SURGICAL DEVICES;
[00155] US Patent Application Serial No. 15/940,653, entitled ADAPTIVE CONTROL PROGRAM UPDATES FOR SURGICAL HUBS;
[00156] US Patent Application Serial No. 15/940,660, entitled CLOUD-BASED MEDICAL ANALYTICS FOR CUSTOMIZATION AND RECOMMENDATIONS TO A USER;
[00157] US Patent Application Serial No. 15/940,679, entitled CLOUD-BASED MEDICAL ANALYTICS FOR LINKING OF LOCAL USAGE TRENDS WITH THE RESOURCE ACQUISITION BEHAVIORS OF LARGER DATA SET;
[00158] US Patent Application Serial No. 15/940,694, entitled CLOUD-BASED MEDICAL ANALYTICS FOR MEDICAL FACILITY SEGMENTED INDIVIDUALIZATION OF INSTRUMENT FUNCTION;
[00159] US Patent Application Serial No. 15/940,634, entitled CLOUD-BASED MEDICAL ANALYTICS FOR SECURITY AND AUTHENTICATION TRENDS AND REACTIVE MEASURES;
[00160] US Patent Application Serial No. 15/940,706, entitled DATA HANDLING AND PRIORITIZATION IN A CLOUD ANALYTICS NETWORK;
[00161] US Patent Application Serial No. 15/940,675, entitled CLOUD INTERFACE FOR COUPLED SURGICAL DEVICES;
[00162] US Patent Application Serial No. 15/940,627, entitled DRIVE ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS;
[00163] US Patent Application Serial No. 15/940,637, entitled COMMUNICATION ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS;
[00164] US Patent Application Serial No. 15/940,642, entitled CONTROLS FOR ROBOT-ASSISTED SURGICAL PLATFORMS;
[00165] US Patent Application Serial No. 15/940,676, entitled AUTOMATIC TOOL ADJUSTMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS;
[00166] US Patent Application Serial No. 15/940,680, entitled CONTROLLERS FOR ROBOT-ASSISTED SURGICAL PLATFORMS;
[00167] US Patent Application Serial No. 15/940,683, entitled COOPERATIVE SURGICAL ACTIONS FOR ROBOT-ASSISTED SURGICAL PLATFORMS;
[00168] US Patent Application Serial No. 15/940,690, entitled DISPLAY ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; and
[00169] US Patent Application Serial No. 15/940,711, entitled SENSING ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS.
[00170] The applicant of the present application owns the following US Provisional Patent Applications, filed on March 28, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:
[00171] US Provisional Patent Application Serial No. 62/649,302, entitled INTERACTIVE SURGICAL SYSTEMS WITH ENCRYPTED COMMUNICATION CAPABILITIES;
[00172] US Provisional Patent Application Serial No. 62/649,294, entitled DATA STRIPPING METHOD TO INTERROGATE PATIENT RECORDS AND CREATE ANONYMIZED RECORD;
[00173] US Provisional Patent Application Serial No. 62/649,300, entitled SURGICAL HUB SITUATIONAL AWARENESS;
[00174] US Provisional Patent Application Serial No. 62/649,309, entitled SURGICAL HUB SPATIAL AWARENESS TO DETERMINE DEVICES IN OPERATING THEATER;
[00175] US Provisional Patent Application Serial No. 62/649,310, entitled COMPUTER IMPLEMENTED INTERACTIVE SURGICAL SYSTEMS;
[00176] US Provisional Patent Application Serial No. 62/649,291, entitled USE OF LASER LIGHT AND RED-GREEN-BLUE COLORATION TO DETERMINE PROPERTIES OF BACK SCATTERED LIGHT;
[00177] US Provisional Patent Application Serial No. 62/649,296, entitled ADAPTIVE CONTROL PROGRAM UPDATES FOR SURGICAL DEVICES;
[00178] US Provisional Patent Application Serial No. 62/649,333, entitled CLOUD-BASED MEDICAL ANALYTICS FOR CUSTOMIZATION AND RECOMMENDATIONS TO A USER;
[00179] US Provisional Patent Application Serial No. 62/649,327, entitled CLOUD-BASED MEDICAL ANALYTICS FOR SECURITY AND AUTHENTICATION TRENDS AND REACTIVE MEASURES;
[00180] US Provisional Patent Application Serial No. 62/649,315, entitled DATA HANDLING AND PRIORITIZATION IN A CLOUD ANALYTICS NETWORK;
[00181] US Provisional Patent Application Serial No. 62/649,313, entitled CLOUD INTERFACE FOR COUPLED SURGICAL DEVICES;
[00182] US Provisional Patent Application Serial No. 62/649,320, entitled DRIVE ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS;
[00183] US Provisional Patent Application Serial No. 62/649,307, entitled AUTOMATIC TOOL ADJUSTMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; and
[00184] US Provisional Patent Application Serial No. 62/649,323, entitled SENSING ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS.
[00185] The applicant of the present application owns the following US Provisional Patent Applications, filed on March 8, 2018, the disclosure of each of which is incorporated herein by reference in its entirety:
[00186] US Provisional Patent Application Serial No. 62/640,417, entitled TEMPERATURE CONTROL IN ULTRASONIC DEVICE AND CONTROL SYSTEM THEREFOR; and
[00187] US Provisional Patent Application Serial No. 62/640,415, entitled ESTIMATING STATE OF ULTRASONIC END EFFECTOR AND CONTROL SYSTEM THEREFOR.
[00188] The applicant of the present application owns the following US Provisional Patent Applications, filed on December 28, 2017, the disclosure of each of which is incorporated herein by reference in its entirety:
[00189] US Provisional Patent Application Serial No. 62/611,341, entitled INTERACTIVE SURGICAL PLATFORM;
[00190] US Provisional Patent Application Serial No. 62/611,340, entitled CLOUD-BASED MEDICAL ANALYTICS; and
[00191] US Provisional Patent Application Serial No. 62/611,339, entitled ROBOT ASSISTED SURGICAL PLATFORM.
[00192] Before explaining in detail the various aspects of surgical instruments and generators, it should be noted that the illustrative examples are not limited, in terms of application or use, to the details of construction and arrangement of parts illustrated in the descriptions
[00193] Referring to Figure 1, a computer-implemented interactive surgical system 100 includes one or more surgical systems 102 and a cloud-based system (for example, the cloud 104, which may include a remote server 113 coupled to a storage device 105). Each surgical system 102 includes at least one central surgical controller 106 in communication with the cloud 104, which can include a remote server 113. In one example, as illustrated in Figure 1, the surgical system 102 includes a visualization system 108, a robotic system 110, and a smart handheld surgical instrument 112, which are configured to communicate with one another and/or with the central controller 106. In some aspects, a surgical system 102 may include an M number of central controllers 106, an N number of visualization systems 108, an O number of robotic systems 110, and a P number of smart handheld surgical instruments 112, where M, N, O, and P are integers greater than or equal to one.
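As a rough illustration of the composition just described, the M/N/O/P constraint can be sketched as a small data structure. The class and attribute names below are hypothetical, not taken from the disclosure; only the counts M, N, O, P >= 1 come from the text.

```python
from dataclasses import dataclass, field

@dataclass
class SurgicalSystem:
    """Hypothetical sketch of surgical system 102 and its components."""
    hubs: list = field(default_factory=list)                    # M central surgical controllers 106
    visualization_systems: list = field(default_factory=list)   # N visualization systems 108
    robotic_systems: list = field(default_factory=list)         # O robotic systems 110
    instruments: list = field(default_factory=list)             # P smart handheld instruments 112

    def is_valid(self) -> bool:
        # M, N, O, and P must each be an integer greater than or equal to one.
        return all(len(group) >= 1 for group in (
            self.hubs, self.visualization_systems,
            self.robotic_systems, self.instruments))
```

For example, a system with one of each component satisfies the constraint, while an empty system does not.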
[00194] Figure 2 depicts an example of a surgical system 102 being used to perform a surgical procedure on a patient who is lying on an operating table 114 in a surgical operating room 116. A robotic system 110 is used in the surgical procedure as a part of the surgical system 102. The robotic system 110 includes a surgeon's console 118, a patient cart 120 (surgical robot), and a robotic central surgical controller 122. The patient cart 120 can manipulate at least one removably coupled surgical tool 117 through a minimally invasive incision in the patient's body while the surgeon views the surgical site through the surgeon's console 118. An image of the surgical site can be obtained by a medical imaging device 124, which can be manipulated by the patient cart 120 to orient the imaging device 124. The robotic central surgical controller 122 can be used to process the images of the surgical site for subsequent display to the surgeon through the surgeon's console 118.
[00195] Other types of robotic systems can be readily adapted for use with the surgical system 102. Various examples of robotic systems and surgical instruments that are suitable for use with the present disclosure are described in US Provisional Patent Application Serial No. 62/611,339, entitled ROBOT ASSISTED SURGICAL PLATFORM, filed on December 28, 2017, the disclosure of which is incorporated herein by reference in its entirety.
[00196] Various examples of cloud-based analyses that are performed by the cloud 104, and are suitable for use with the present disclosure, are described in US Provisional Patent Application Serial No. 62/611,340, entitled CLOUD-BASED MEDICAL ANALYTICS, filed on December 28, 2017, the disclosure of which is incorporated herein by reference in its entirety.
[00197] In various aspects, the imaging device 124 includes at least one image sensor and one or more optical components. Suitable image sensors include, but are not limited to, charge-coupled device (CCD) sensors and complementary metal-oxide semiconductor (CMOS) sensors.
[00198] The optical components of the imaging device 124 may include one or more light sources and/or one or more lenses. The one or more light sources can be directed to illuminate portions of the surgical field. The one or more image sensors can receive light reflected or refracted from the surgical field, including light reflected or refracted from tissue and/or surgical instruments.
[00199] The one or more light sources can be configured to radiate electromagnetic energy in the visible spectrum as well as in the invisible spectrum. The visible spectrum, sometimes called the optical spectrum or luminous spectrum, is that portion of the electromagnetic spectrum that is visible to (that is, can be detected by) the human eye and may be called visible light or simply light. A typical human eye will respond to wavelengths in air from about 380 nm to about 750 nm.
[00200] The invisible spectrum (that is, the non-luminous spectrum) is that portion of the electromagnetic spectrum that lies below and above the visible spectrum (that is, wavelengths below about 380 nm and above about 750 nm). The invisible spectrum is not detectable by the human eye. Wavelengths greater than about 750 nm are longer than the red visible spectrum, and they become invisible infrared (IR), microwave, and radio electromagnetic radiation. Wavelengths less than about 380 nm are shorter than the violet spectrum, and they become invisible ultraviolet, x-ray, and gamma-ray electromagnetic radiation.
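The spectral boundaries described in the two paragraphs above can be summarized in a short sketch; the function name is an illustrative assumption, and the 380 nm / 750 nm cutoffs follow the approximate values stated in the text.

```python
def classify_wavelength(nm: float) -> str:
    """Place a wavelength (in air, in nm) on the spectrum described above:
    roughly 380-750 nm is visible; shorter or longer is invisible."""
    if nm < 380:
        # Shorter than the violet end: ultraviolet, x-ray, gamma-ray.
        return "invisible (ultraviolet / x-ray / gamma-ray)"
    if nm <= 750:
        return "visible"
    # Longer than the red end: infrared, microwave, radio.
    return "invisible (infrared / microwave / radio)"
```

For example, 550 nm (green light) falls in the visible band, while 1,000 nm falls in the infrared portion of the invisible spectrum.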
[00201] In various aspects, the imaging device 124 is con-
[00202] In one aspect, the imaging device employs multi-spectrum monitoring to discriminate topography and underlying structures. A multi-spectral image is one that captures image data within specific wavelength bands across the electromagnetic spectrum. The wavelengths can be separated by filters or by using instruments that are sensitive to specific wavelengths, including light from frequencies beyond the visible light range, for example, IR and ultraviolet light. Spectral imaging can allow the extraction of additional information that the human eye fails to capture with its receptors for red, green, and blue. The use of multi-spectral imaging is described in more detail under the heading "Advanced Imaging Acquisition Module" in US Provisional Patent Application Serial No. 62/611,341, entitled INTERACTIVE SURGICAL PLATFORM, filed on December 28, 2017, the disclosure of which is incorporated herein by reference in its entirety. Multi-spectrum monitoring can be a useful tool for relocating a surgical field after a surgical task is completed to perform one or more of the previously described tests on the treated tissue.
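A minimal sketch of the multi-spectral idea described above, assuming a hypothetical frame representation as a mapping from band name to a 2-D array of intensities (the band names and function names are illustrative, not from the disclosure):

```python
# Hypothetical representation: one captured 2-D intensity array per spectral
# band, including bands (e.g. "ir", "uv") beyond the eye's red/green/blue
# receptors.
def band_mean(frame: dict, band: str) -> float:
    """Mean intensity of one spectral band, usable e.g. to compare an IR
    band against a visible band when discriminating underlying structures."""
    pixels = frame[band]
    total = sum(sum(row) for row in pixels)
    count = sum(len(row) for row in pixels)
    return total / count

def extra_bands(frame: dict) -> list:
    """Bands carrying information the human eye cannot capture directly."""
    return sorted(b for b in frame if b not in ("red", "green", "blue"))
```

A frame holding a "red" band and an "ir" band, for instance, would report "ir" as the band beyond the eye's receptors.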
[00203] It is axiomatic that strict sterilization of the operating room and surgical equipment is required during any surgery. The strict hygiene and sterilization conditions required in an "operating room", that is, an operating or treatment room, necessitate the highest possible sterility of all medical devices and equipment. Part of that sterilization process is the need to sterilize anything that comes into contact with the patient or enters the sterile field, including the imaging device 124 and its connectors and components. It will be understood that the sterile field may be considered a specified area, such as within a tray or on a sterile towel, that is considered free of microorganisms, or the sterile field may be considered an area, immediately around a patient, that has been prepared for a surgical procedure. The sterile field may include the scrubbed team members, who are properly attired, and all furniture and fixtures in the area.
[00204] In various aspects, the visualization system 108 includes one or more imaging sensors, one or more image-processing units, one or more storage arrays, and one or more displays that are strategically arranged with respect to the sterile field, as illustrated in Figure 2. In one aspect, the visualization system 108 includes an interface for HL7, PACS, and EMR. Various components of the visualization system 108 are described under the heading "Advanced Imaging Acquisition Module" in US Provisional Patent Application Serial No. 62/611,341, entitled INTERACTIVE SURGICAL PLATFORM, filed on December 28, 2017, the disclosure of which is hereby incorporated by reference in its entirety.
[00205] As shown in Figure 2, a primary screen 119 is positioned in the sterile field to be visible to the operator at the operating table 114. In addition, a viewing tower 111 is positioned outside the sterile field. The viewing tower 111 includes a first non-sterile screen 107 and a second non-sterile screen 109, which are opposite each other. The visualization system 108, guided by the central controller 106, is configured to use the screens 107,
[00206] In one aspect, the central controller 106 is also configured to route a diagnostic input or feedback entered by a non-sterile operator at the viewing tower 111 to the primary screen 119 within the sterile field, where it can be seen by a sterile operator at the operating table. In one example, the input may be in the form of a modification to the snapshot displayed on the non-sterile screen 107 or 109, which can be routed to the primary screen 119 by the central controller 106.
[00207] With reference to Figure 2, a surgical instrument 112 is being used in the surgical procedure as part of the surgical system 102. The central controller 106 is also configured to coordinate the flow of information to a screen of the surgical instrument 112. The coordinated flow of information is further described in US Provisional Patent Application Serial No. 62/611,341, entitled INTERACTIVE SURGICAL PLATFORM, filed on December 28, 2017, the disclosure of which is incorporated herein by reference in its entirety. A diagnostic input or feedback entered by a non-sterile operator at the viewing tower 111 can be routed by the central controller 106 to the surgical instrument screen 115 within the sterile field, where it can be seen by the operator of the surgical instrument 112. Exemplary surgical instruments that are suitable for use with the surgical system 102 are described under the heading "Hardware of Surgical Instruments" in US Provisional Patent Application Serial No. 62/611,341, entitled INTERACTIVE SURGICAL PLATFORM, filed on December 28, 2017, the disclosure of which is incorporated herein by reference in its entirety.
[00208] Referring now to Figure 3, a central controller 106 is shown in communication with a visualization system 108, a robotic system 110, and a smart handheld surgical instrument 112. The central controller 106 includes a central controller screen 135, an imaging module 138, a generator module 140 (which may include a monopolar generator 142, a bipolar generator 144, and/or an ultrasonic generator 143), a communication module 130, a processor module 132, and a storage array 134. In certain aspects, as illustrated in Figure 3, the central controller 106 additionally includes a smoke evacuation module 126, a suction/irrigation module 128, and/or an operating room (OR) mapping module 133.
[00209] During a surgical procedure, the application of energy to tissue, for sealing and/or cutting, is generally associated with smoke evacuation, suction of excess fluid, and/or irrigation of the tissue. Fluid, power, and/or data lines from different sources are often entangled during the surgical procedure. Valuable time can be lost addressing this issue during a surgical procedure. To disentangle the lines, it may be necessary to disconnect the lines from their respective modules, which may require a restart of the modules. The modular housing of the central controller 136 offers a unified environment for managing power, data, and fluid lines, which reduces the frequency of entanglement between such lines.
[00210] Aspects of the present disclosure present a constraint
[00211] In one aspect, the fluid line is a first fluid line, and a second fluid line extends from the remote surgical site to a suction and irrigation module slidably received in the central controller housing. In one aspect, the central controller housing comprises a fluid interface.
[00212] Certain surgical procedures may require the application of more than one type of energy to the tissue. One energy type may be more beneficial for cutting the tissue, while another energy type may be more beneficial for sealing the tissue. For example, a bipolar generator can be used to seal the tissue while an ultrasonic generator can be used to cut the sealed tissue. Aspects of the present disclosure present a solution in which a modular housing of the central controller 136 is configured to accommodate different generators and facilitate interactive communication between them. One of the advantages of the central modular housing 136 is that it enables quick removal and/or replacement of various modules.
[00213] Aspects of the present disclosure present a modular surgical enclosure for use in a surgical procedure that involves applying energy to tissue. The modular surgical enclosure includes a first energy generator module, configured to generate a first energy for application to the tissue, and a first docking station comprising a first docking port that includes first data and power contacts, wherein the first energy generator module is slidably movable into an electrical engagement with the power and data contacts, and wherein the first energy generator module is slidably movable out of the electrical engagement with the first power and data contacts.
[00214] Further to the above, the modular surgical enclosure also includes a second energy generator module configured to generate a second energy, different from the first energy, for application to the tissue, and a second docking station comprising a second docking port that includes second data and power contacts, wherein the second energy generator module is slidably movable into an electrical engagement with the power and data contacts, and wherein the second energy generator module is slidably movable out of the electrical engagement with the second power and data contacts.
[00215] In addition, the modular surgical enclosure also includes a communication bus between the first docking port and the second docking port, configured to facilitate communication between the first energy generator module and the second energy generator module.
[00216] Referring to Figures 3 to 7, aspects of the present disclosure are presented for a modular housing of the central controller 136 that allows the modular integration of a generator module 140, a smoke evacuation module 126, and a suction/irrigation module 128. The central modular housing 136 further facilitates interactive communication between the modules 140, 126, 128. As illustrated in Figure 5, the generator module 140 can be a generator module with integrated monopolar, bipolar, and ultrasonic components supported in a single housing unit 139 slidably insertable into the central modular housing 136. As illustrated in Figure 5, the generator module 140 can be configured to connect to a monopolar device 146, a bipolar device 147, and an ultrasonic device 148. Alternatively, the generator module 140 may comprise a series of monopolar, bipolar, and/or ultrasonic generator modules that interact through the central modular housing
[00217] In one aspect, the central modular housing 136 comprises a modular power and communication backplane 149 with external and wireless communication headers to enable the removable attachment of the modules 140, 126, 128 and interactive communication therebetween.
[00218] In one aspect, the central modular housing 136 includes docking stations, or drawers, 151, which are configured to slidably receive the modules 140, 126, 128. Figure 4 illustrates a partial perspective view of a central surgical controller housing 136, and a combined generator module 145 slidably received at a docking station 151 of the central surgical controller housing 136. A docking port 152 with power and data contacts on a rear side of the combined generator module 145 is configured to engage a corresponding docking port 150 with power and data contacts of a corresponding docking station 151 of the modular housing of the central controller 136 as the combined generator module 145 is slid into position at the corresponding docking station 151 of the modular housing of the central controller 136. In one aspect, the combined generator module 145 includes a bipolar, ultrasonic, and monopolar module and a smoke evacuation module integrated into a single housing unit 139, as illustrated in Figure 5.
[00219] In various aspects, the smoke evacuation module 126 includes a fluid line 154 that conveys captured/collected smoke and fluid away from a surgical site and to, for example, the smoke evacuation module 126. Vacuum suction originating from the smoke evacuation module 126 can draw the smoke into an opening of a utility conduit at the surgical site. The utility conduit, coupled to the fluid line, can be in the form of a flexible tube terminating at the smoke evacuation module 126. The utility conduit and the fluid line define a fluid path extending toward the smoke evacuation module 126 that is received in the central controller housing 136.
[00220] In various aspects, the suction/irrigation module 128 is coupled to a surgical tool comprising an irrigation fluid line and a suction fluid line. In one example, the irrigation and suction fluid lines are in the form of flexible tubes extending from the surgical site toward the suction/irrigation module 128. One or more drive systems can be configured to cause irrigation and aspiration of fluids to and from the surgical site.
[00221] In one aspect, the surgical tool includes a shaft that has an end effector at a distal end thereof and at least one energy treatment device associated with the end effector, a suction tube, and an irrigation tube. The suction tube can have an inlet port at a distal end thereof, and the suction tube extends through the shaft. Similarly, an irrigation tube can extend through the shaft and can have an inlet port near the energy application implement. The energy application implement is configured to deliver ultrasonic and/or RF energy to the surgical site and is coupled to the generator module 140 by a cable that initially extends through the shaft.
[00222] The irrigation tube can be in fluid communication with a fluid source, and the suction tube can be in fluid communication with a vacuum source. The fluid source and/or the vacuum source can be housed in the suction/irrigation module 128. In one example, the fluid source and/or the vacuum source can be housed in the central controller housing 136 separately from the suction/irrigation module 128. In such an example, a fluid interface can be configured to connect the suction/irrigation module 128 to the fluid source and/or the vacuum source.
[00223] In one aspect, the modules 140, 126, 128 and/or their corresponding docking stations in the central modular housing 136 may include alignment features that are configured to align the docking ports of the modules into engagement with their counterparts in the docking stations of the central modular housing 136. For example, as illustrated in Figure 4, the generator module
[00224] In some aspects, the drawers 151 of the central modular housing 136 are the same, or substantially the same, size, and the modules are sized to be received in the drawers
[00225] In addition, the contacts of a particular module can be keyed to engage the contacts of a particular drawer, to avoid inserting a module into a drawer with mismatched contacts.
[00226] As illustrated in Figure 4, the docking port 150 of one drawer 151 can be coupled to the docking port 150 of another drawer 151 through a communication link 157 to facilitate interactive communication between the modules housed in the central modular housing 136. Alternatively, or in addition, the docking ports 150 of the central modular housing 136 can facilitate wireless interactive communication between the modules housed in the central modular housing 136. Any suitable wireless communication can be employed, such as, for example, Air Titan-Bluetooth.
[00227] Figure 6 illustrates individual power bus connectors for a plurality of lateral docking ports of a lateral modular housing 160 configured to receive a plurality of
[00228] Figure 7 illustrates a vertical modular housing 164 configured to receive a plurality of modules 165 of the central surgical controller 106. The modules 165 are slidably inserted into docking stations, or drawers, 167 of the vertical modular housing 164, which includes a backplane for interconnecting the modules 165. Although the drawers 167 of the vertical modular housing 164 are arranged vertically, in some cases a vertical modular housing 164 may include drawers that are arranged laterally. In addition, the modules 165 can interact with one another through the docking ports of the vertical modular housing
[00229] In various aspects, the imaging module 138 comprises an integrated video processor and a modular light source, and is adapted for use with various imaging devices. In one aspect, the imaging device is comprised of a modular housing that can be assembled with a light source module and a camera module. The housing can be a disposable housing. In at least one example, the disposable housing is removably coupled to a reusable controller, a light source module, and a camera module. The light source module and/or the camera module can be selectively chosen depending on the type of surgical procedure. In one aspect, the camera module comprises a CCD sensor. In another aspect, the camera module comprises a CMOS sensor. In another aspect, the camera module is configured for scanned-beam imaging. Similarly, the light source module can be configured to provide white light or a different light, depending on the surgical procedure.
[00230] During a surgical procedure, removing a surgical device from the surgical field and replacing it with another surgical device that includes a different camera or a different light source can be inefficient. Temporarily losing sight of the surgical field can lead to undesirable consequences. The imaging device module of the present disclosure is configured to allow the replacement of a light source module or a camera module "midstream" during a surgical procedure, without having to remove the imaging device from the surgical field.
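The midstream-replacement behavior described above can be sketched as follows; the class and method names are illustrative assumptions, not an API from the disclosure.

```python
class ImagingDevice:
    """Sketch of the modular imaging device: a housing holding a
    selectable camera module and a selectable light source module."""

    def __init__(self, camera_module: str, light_source_module: str):
        self.camera = camera_module        # e.g. "CCD", "CMOS", "scanned-beam"
        self.light = light_source_module   # e.g. "white", "narrow-band"

    def swap_light_source(self, new_module: str) -> None:
        # Replace the light source "midstream", without removing the
        # device from the surgical field (sight of the field is kept).
        self.light = new_module

    def swap_camera(self, new_module: str) -> None:
        # The camera module can likewise be exchanged in place.
        self.camera = new_module
```

The point of the design is that a swap changes only the affected module while the rest of the device state is retained.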
[00231] In one aspect, the imaging device comprises a tubular housing that includes a plurality of channels. A first channel is configured to slidably receive the camera module, which can be configured for a snap-fit engagement with the first channel. A second channel is configured to slidably receive the light source module, which can be configured for a snap-fit engagement with the second channel. In another example, the camera module and/or the light source module can be rotated into a final position within their respective channels. A threaded engagement can be employed instead of a snap-fit engagement.
[00232] In various examples, multiple imaging devices are placed at different positions in the surgical field to provide multiple views. The imaging module 138 can be configured to switch between the imaging devices to provide an optimal view. In various aspects, the imaging module 138 can be configured to integrate the images from the different imaging devices.
[00233] Various image processors and imaging devices suitable for use with the present disclosure are described in US Patent No. 7,995,045, entitled COMBINED SBI AND CONVENTIONAL IMAGE PROCESSOR, granted on August 9, 2011, which is incorporated herein by reference in its entirety. In addition, US Patent No. 7,982,776, entitled SBI MOTION ARTIFACT REMOVAL APPARATUS AND METHOD, granted on July 19, 2011, which is incorporated herein by reference in its entirety, describes various systems for removing motion artifacts from image data. Such systems can be integrated with the imaging module 138. In addition to these, US Patent Application Publication No. 2011/0306840, entitled CONTROLLABLE MAGNETIC SOURCE TO FIXTURE INTRACORPOREAL APPARATUS, published on December 15, 2011, and US Patent Application Publication No. 2014/0243597, entitled SYSTEM FOR PERFORMING A MINIMALLY INVASIVE SURGICAL PROCEDURE, published on August 28, 2014, are each incorporated herein by reference in their entirety.
[00234] Figure 8 illustrates a surgical data network 201 comprising a modular communication center 203 configured to connect modular devices located in one or more operating rooms of a healthcare facility, or any environment.
[00235] The modular devices 1a to 1n located in the operating room can be coupled to the modular communication center 203. The central network controller 207 and/or the network switch 209 can be coupled to a network router 211 to connect the devices 1a to 1n to the cloud 204 or to the local computer system 210. Data associated with the devices 1a to 1n can be transferred to cloud-based computers through the router for remote data processing and manipulation. Data associated with the devices 1a to 1n can also be transferred to the local computer system 210.
[00236] It will be understood that the surgical data network 201 can be expanded by interconnecting multiple central network controllers 207 and/or multiple network switches 209 with multiple network routers 211. The modular communication center 203 can be contained in a modular control tower configured to receive multiple devices 1a to 1n/2a to 2m. The local computer system 210 can also be contained in a modular control tower. The modular communication center 203 is connected to a screen 212 to display images obtained by some of the devices 1a to 1n/2a to 2m, for example, during surgical procedures. In various aspects, the devices 1a to 1n/2a to 2m can include, for example, various modules such as an imaging module 138 coupled to an endoscope, a generator module 140 coupled to an energy-based surgical device, a smoke evacuation module 126, a suction/irrigation module 128, a communication module 130, a processor module 132, a storage array 134, a surgical device coupled to a screen, and/or a non-contact sensor module, among other modular devices that can be connected to the modular communication center 203 of the surgical data network 201.
[00237] In one aspect, the surgical data network 201 can
[00238] Through the application of cloud-computing data processing techniques to the data collected by the devices 1a to 1n/2a to 2m, the surgical data network provides improved surgical outcomes, reduced costs, and improved patient satisfaction.
[00239] In one implementation, the operating room devices 1a to 1n can be connected to the modular communication center 203 via a wired channel or a wireless channel, depending on the configuration of the devices 1a to 1n to a central network controller. The central network controller 207 can be implemented, in one aspect, as a LAN broadcast device that works at the physical layer of the open system interconnection (OSI) model. The central network controller provides connectivity to the devices 1a to 1n located on the same network as the operating room. The central network controller 207 collects data in the form of packets and sends them to the router in half-duplex mode. The central network controller 207 does not store any media access control/internet protocol (MAC/IP) addresses to transfer device data; only one of the devices 1a to 1n can send data at a time through the central network controller 207. The central network controller 207 has no routing tables or intelligence about where to send information and broadcasts all network data over each connection, as well as to a remote server 213 (Figure 9) in the cloud 204. The central network controller 207 can detect basic network errors, such as collisions, but having all information broadcast to multiple ports can be a security risk and cause bottlenecks.
[00240] [00240] In another implementation, operating room devices 2a to 2m can be connected to a network switch 209 via a wired channel or a wireless channel. The network switch 209 works in the data link layer of the OSI model. The network switch 209 is a multicast device for connecting devices 2a to 2m located in the same operating room to the network. The network switch 209 sends data in frame form to the network router 211 and works in full-duplex mode. Multiple devices 2a to 2m can send data at the same time through the network switch 209. The network switch 209 stores and uses the MAC addresses of devices 2a to 2m to transfer data.
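As a hypothetical sketch of the forwarding behavior described above (not part of the patent), the contrast between the hub 207 and the switch 209 can be illustrated with a minimal MAC-learning table. The class and port numbering below are invented for illustration; a real switch floods unknown destinations to all physical ports.

```python
# Minimal sketch of layer-2 MAC learning, as in network switch 209:
# the switch remembers which port each source MAC arrived on, then
# forwards frames only to the learned port instead of broadcasting.

class LearningSwitch:
    def __init__(self):
        self.mac_table = {}  # MAC address -> port number

    def receive(self, in_port, src_mac, dst_mac):
        """Learn the source MAC, then decide where to send the frame."""
        self.mac_table[src_mac] = in_port  # remember where src lives
        if dst_mac in self.mac_table:
            return [self.mac_table[dst_mac]]  # unicast to the known port
        # Unknown destination: flood to the other learned ports (sketch)
        return [p for p in self.mac_table.values() if p != in_port]

switch = LearningSwitch()
switch.receive(1, "aa:aa", "bb:bb")        # destination unknown: flooded
out = switch.receive(2, "bb:bb", "aa:aa")  # "aa:aa" learned on port 1
```

A hub, by contrast, would return every port on every frame, which is the broadcast behavior (and security risk) noted for the central network controller 207.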
[00241] [00241] The central network controller 207 and / or the network switch 209 are coupled to the network router 211 for a connection to the cloud 204. The network router 211 works on the network layer of the OSI model. The network router 211 creates a route to transmit data packets received from the central network controller 207 and / or the network switch 209 to cloud-based computer resources for further processing and manipulation of the data collected by any or all of the devices 1a to 1n / 2a to 2m. The network router 211 can be used to connect two or more different networks located in different locations, such as different operating rooms in the same healthcare facility or different networks located in different operating rooms of different healthcare facilities. The network router 211 sends data in packet form to the cloud 204 and works in full-duplex mode. Multiple devices can send data at the same time. The network router 211 uses IP addresses to transfer data.
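The IP-based forwarding decision described above can be sketched as a longest-prefix-match lookup. This is a hypothetical illustration, not from the patent: the network prefixes and next-hop names are invented.

```python
# Sketch of how a device like network router 211 uses IP addresses
# (rather than MAC addresses) to choose a next hop toward the cloud.
import ipaddress

ROUTES = [
    (ipaddress.ip_network("10.1.0.0/16"), "or-1 switch"),  # operating room 1
    (ipaddress.ip_network("10.2.0.0/16"), "or-2 switch"),  # operating room 2
    (ipaddress.ip_network("0.0.0.0/0"), "cloud uplink"),   # default route
]

def next_hop(dst_ip):
    addr = ipaddress.ip_address(dst_ip)
    # Pick the matching route with the longest prefix (most specific)
    matches = [(net.prefixlen, hop) for net, hop in ROUTES if addr in net]
    return max(matches)[1]

# Packets for local operating room networks stay inside the facility;
# everything else is forwarded toward the cloud 204.
```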
[00242] [00242] In one example, the central network controller 207 can be implemented as a central USB controller, which allows multiple USB devices to be connected to a host computer. The central USB controller can expand a single USB port to several levels so that more ports are available to connect devices to the host computer system. The central network controller 207 can include wired or wireless capabilities to receive information over a wired channel or a wireless channel. In one aspect, a wireless USB short-range, high-bandwidth wireless radio communication protocol can be used for communication between devices 1a to 1n and devices 2a to 2m located in the operating room.
[00243] [00243] In other examples, operating room devices 1a to 1n / 2a to 2m can communicate with the modular communication center 203 via standard Bluetooth wireless technology for exchanging data over short distances (using short-wavelength UHF radio waves in the 2.4 to 2.485 GHz ISM band) from fixed and mobile devices and for building personal area networks ("PANs"). In other respects, operating room devices 1a to 1n / 2a to 2m can communicate with the modular communication center 203 through a number of wireless or wired communication standards or protocols, including, but not limited to, Wi-Fi (IEEE 802.11 family), WiMAX (IEEE family
[00244] [00244] The modular communication center 203 can serve as a central connection for one or all of the operating room devices 1a to 1n / 2a to 2m and handles a type of data known as frames. Frames carry the data generated by the devices 1a to 1n / 2a to 2m. When a frame is received by the modular communication center 203, it is amplified and transmitted to the network router 211, which transfers the data to the cloud computing resources using a number of wireless or wired communication standards or protocols, as described in the present invention.
[00245] [00245] The modular communication center 203 can be used as a standalone device or be connected to compatible central network controllers and network switches to form a larger network. The modular communication center 203 is, in general, easy to install, configure, and maintain, making it a good option for networking the operating room devices 1a to 1n / 2a to 2m.
[00246] [00246] Figure 9 illustrates a computer-implemented interactive surgical system 200. The computer-implemented interactive surgical system 200 is similar in many ways to the computer-implemented interactive surgical system 100. For example, the computer-implemented interactive surgical system 200 includes one or more surgical systems 202, which are similar in many respects to surgical systems 102. Each surgical system 202 includes at least one central surgical controller 206 in communication with a cloud 204 that can include a remote server
[00247] [00247] Figure 10 illustrates a central surgical controller 206 comprising a plurality of modules coupled to the modular control tower 236. The modular control tower 236 comprises a modular communication center 203, for example, a network connectivity device, and a computer system 210 to provide local processing, visualization, and imaging, for example. As shown in Figure 10, the modular communication center 203 can be connected in a layered configuration to expand the number of modules (for example, devices) that can be connected to the modular communication center 203 and to transfer data associated with the modules to the computer system 210, cloud computing resources, or both. As shown in Figure 10, each of the central controllers / network switches in the modular communication center 203 includes three downstream ports and one upstream port. The upstream central controller / network switch is connected to a processor to provide a communication connection with the cloud computing resources and a local display 217. Communication with the cloud 204 can be done through a wired or wireless communication channel.
[00248] [00248] The central surgical controller 206 employs a non-contact sensor module 242 to measure the dimensions of the operating room and generate a map of the operating room using non-contact measuring devices, such as laser or ultrasonic devices. An ultrasound-based non-contact sensor module scans the operating room by transmitting a burst of ultrasound and receiving the echo when it bounces off the perimeter walls of the operating room, as described under the heading "Surgical Hub Spatial Awareness Within an Operating Room" in US Provisional Patent Application serial number 62 / 611,341, entitled INTERACTIVE SURGICAL PLATFORM, filed on December 28, 2017, which is hereby incorporated by reference in its entirety, in which the sensor module is configured to determine the size of the operating room and adjust the Bluetooth pairing distance limits. A laser-based non-contact sensor module scans the operating room by transmitting pulses of laser light, receiving laser light pulses that bounce off the perimeter walls of the operating room, and comparing the phase of the transmitted pulse to the received pulse to determine the size of the operating room and to adjust the Bluetooth pairing distance limits, for example.
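The echo-based measurement described above reduces to a time-of-flight calculation: the pulse travels to the wall and back, so the wall distance is half the round-trip path. The sketch below is illustrative only; the timing values are invented, not from the patent.

```python
# Sketch of converting an echo's round-trip time into a room dimension,
# as a non-contact sensor module like module 242 might do.

SPEED_OF_SOUND_M_S = 343.0           # ultrasound in air at ~20 degrees C
SPEED_OF_LIGHT_M_S = 299_792_458.0   # laser pulse

def distance_from_echo(round_trip_s, speed_m_s):
    # The pulse travels to the wall and back, so halve the path length
    return speed_m_s * round_trip_s / 2.0

# An ultrasonic burst returning after ~35 ms implies a wall ~6 m away
ultrasonic_d = distance_from_echo(0.035, SPEED_OF_SOUND_M_S)
# A laser pulse returning after ~40 ns implies a wall ~6 m away
laser_d = distance_from_echo(40e-9, SPEED_OF_LIGHT_M_S)
```

The same geometry underlies both sensor types; the laser variant additionally compares transmitted and received pulse phase, which this sketch does not model.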
[00249] [00249] Computer system 210 comprises a processor 244 and a network interface 245. Processor 244 is coupled to a communication module 247, storage 248, memory 249, non-volatile memory 250, and an input / output interface 251 via a system bus. The system bus can be any of several types of bus structures, including the memory bus or memory controller, a peripheral bus or external bus, and / or a local bus using any of a variety of available bus architectures, including, but not limited to, the 9-bit bus, Industry Standard Architecture (ISA), Micro-Channel Architecture (MSA), Extended ISA (EISA), Intelligent Drive Electronics (IDE), VESA Local Bus (VLB), Peripheral Component Interconnect (PCI), USB, Accelerated Graphics Port (AGP), Personal Computer Memory Card International Association (PCMCIA) bus, Small Computer Systems Interface (SCSI), or any other proprietary bus.
[00250] [00250] Processor 244 can be any single-core or multi-core processor, such as those known under the trade name ARM Cortex, available from Texas Instruments. In one aspect, the processor may be an LM4F230H5QR ARM Cortex-M4F processor core, available from Texas Instruments, for example, which comprises an integrated 256 KB single-cycle flash memory, or other non-volatile memory, of up to 40 MHz, a prefetch buffer to optimize performance above 40 MHz, a 32 KB single-cycle serial random access memory (SRAM), an internal read-only memory (ROM) loaded with StellarisWare® software, 2 KB of electrically erasable programmable read-only memory (EEPROM), one or more pulse width modulation (PWM) modules, one or more quadrature encoder inputs (QEI), one or more 12-bit analog-to-digital converters (ADC) with 12 analog input channels, details of which are available in the product data sheet.
[00251] [00251] In one aspect, processor 244 may comprise a safety controller comprising two controller-based families, such as TMS570 and RM4x, known under the trade name Hercules ARM Cortex R4, also by Texas Instruments. The safety controller can be configured specifically for IEC 61508 and ISO 26262 safety critical applications, among others, to provide advanced integrated safety features while providing scalable performance, connectivity and memory options.
[00252] [00252] System memory includes volatile and non-volatile memory. The basic input / output system (BIOS), containing the basic routines to transfer information between elements within the computer system, such as during startup, is stored in non-volatile memory. For example, non-volatile memory can include ROM, programmable ROM (PROM), electrically programmable ROM (EPROM), EEPROM, or flash memory. Volatile memory includes random access memory (RAM), which acts as external cache memory. In addition, RAM is available in many forms, such as SRAM, dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and direct Rambus RAM (DRRAM).
[00253] [00253] Computer system 210 also includes removable / non-removable, volatile / non-volatile computer storage media, for example, disk storage. Disk storage includes, but is not limited to, devices such as a magnetic disk drive, floppy disk drive, tape drive, Jaz drive, Zip drive, LS-60 drive, flash memory card, or memory stick (pen drive). In addition, disk storage can include storage media separately or in combination with other storage media, including, but not limited to, an optical disc drive such as a compact disc ROM (CD-ROM) drive, recordable compact disc (CD-
[00254] [00254] It is to be understood that computer system 210 includes software that acts as an intermediary between users and the basic computer resources described in a suitable operating environment. Such software includes an operating system. The operating system, which can be stored on disk storage, acts to control and allocate computer system resources. System applications benefit from the management of resources by the operating system through program modules and program data stored in system memory or on disk storage. It is to be understood that the various components described in the present invention can be implemented with various operating systems or combinations of operating systems.
[00255] [00255] A user enters commands or information into the computer system 210 through the input device(s) coupled to the I/O interface 251. Input devices include, but are not limited to, a pointing device such as a mouse, trackball, stylus, touchpad, keyboard, microphone, joystick, game pad, satellite dish, scanner, TV tuner card, digital camera, digital video camera, web camera, and the like. These and other input devices connect to the processor through the system bus via the interface port(s). The interface ports include, for example, a serial port, a parallel port, a game port, and a USB port. Output devices use some of the same types of ports as the input devices. Thus, for example, a USB port can be used to provide
[00256] [00256] Computer system 210 can operate in a networked environment using logical connections with one or more remote computers, such as cloud computers, or local computers. Remote cloud computers can be a personal computer, server, router, network PC, workstation, microprocessor-based device, peer device, or other common network node, and the like, and typically include many or all of the elements described in relation to the computer system. For the sake of brevity, only one memory storage device is illustrated with the remote computer. Remote computers are logically connected to the computer system through a network interface and then physically connected via a communication connection. The network interface encompasses communication networks such as local area networks (LANs) and wide area networks (WANs). LAN technologies include fiber distributed data interface (FDDI), copper distributed data interface (CDDI), Ethernet / IEEE 802.3, Token Ring / IEEE 802.5, and the like. WAN technologies include, but are not limited to, point-to-point links, circuit switching networks such as integrated services digital networks (ISDN) and variations thereon, packet switching networks, and digital subscriber lines (DSL).
[00257] [00257] In several respects, the computer system 210 of Figure 10, the imaging module 238 and / or display system 208, and / or the processor module 232 of Figures 9 to 10, can comprise an image processor, image processing engine, media processor, or any specialized digital signal processor (DSP) used for processing digital images. The image processor can employ parallel computing with single instruction, multiple data (SIMD) or multiple instruction, multiple data (MIMD) technologies to increase speed and efficiency. The digital image processing engine can perform a number of tasks. The image processor can be an integrated circuit system with a multi-core processor architecture.
[00258] [00258] The communication connection(s) refers to the hardware / software used to connect the network interface to the bus. Although the communication connection is shown for illustrative clarity within computer system 210, it can also be external to computer system 210. The hardware / software required for connection to the network interface includes, for illustrative purposes only, internal and external technologies such as modems, including regular telephone-grade modems, cable modems and DSL modems, ISDN adapters, and Ethernet cards.
[00259] [00259] Figure 11 illustrates a functional block diagram of one aspect of a USB central network controller device 300, in accordance with at least one aspect of the present disclosure. In the illustrated aspect, the USB central network controller device 300 employs a TUSB2036 integrated circuit central controller available from Texas Instruments. The USB central network controller 300 is a CMOS device that provides an upstream USB transceiver port 302 and up to three downstream USB transceiver ports.
[00260] [00260] The USB central network controller device 300 is implemented with a digital state machine instead of a microcontroller, and no firmware programming is required. Fully compatible USB transceivers are integrated into the circuit for the upstream USB transceiver port 302 and all downstream USB transceiver ports 304, 306, 308. The downstream USB transceiver ports 304, 306, 308 support both full speed and low speed by automatically configuring the scan rate according to the speed of the device attached to the ports. The USB central network controller device 300 can be configured in bus-powered or self-powered mode and includes central power logic 312 to manage power.
[00261] [00261] The USB central network controller device 300 includes a serial interface engine (SIE) 310. The SIE 310 is the front end of the USB central network controller 300 hardware and handles most of the protocol described in chapter 8 of the USB specification. The SIE 310 typically comprises signaling up to the transaction level. The functions it handles can include: packet recognition, transaction sequencing, SOP, EOP, RESET and RESUME signal detection / generation, clock / data separation, non-return-to-zero inverted (NRZI) data encoding / decoding, CRC generation and checking (token and data), generation and
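One of the SIE functions named above, NRZI encoding, is simple enough to sketch: in USB's NRZI scheme a logical 0 is transmitted as a level transition and a logical 1 as no transition. The sketch below is illustrative and ignores USB bit stuffing; the initial line level is an assumption.

```python
# Sketch of USB-style NRZI: 0 -> toggle the line level, 1 -> hold it.

def nrzi_encode(bits, level=1):
    out = []
    for bit in bits:
        if bit == 0:
            level ^= 1     # a 0 toggles the line level
        out.append(level)  # a 1 leaves the level unchanged
    return out

def nrzi_decode(levels, level=1):
    bits = []
    for cur in levels:
        bits.append(1 if cur == level else 0)  # no change -> 1, change -> 0
        level = cur
    return bits

data = [0, 1, 0, 0, 1, 1, 1, 0]
assert nrzi_decode(nrzi_encode(data)) == data  # round-trips losslessly
```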
[00262] [00262] In several aspects, the USB central network controller 300 can connect 127 functions configured in up to six logical layers (levels) to a single computer. In addition, the USB central network controller 300 can connect all peripherals using a standardized four-wire cable that provides both communication and power distribution. The power settings are bus-powered and self-powered modes. The USB central network controller 300 can be configured to support four power management modes: a bus-powered central controller, with individual port power management or grouped port power management, and a self-powered central controller, with individual port power management or grouped port power management. In one aspect, using a USB cable, the upstream USB transceiver port 302 of the USB central network controller 300 is plugged into a USB host controller, and the downstream USB transceiver ports 304, 306, 308 are exposed to connect USB-compatible devices, and so on.
[00263] [00263] Additional details regarding the structure and function of the central surgical controller and / or networks of central surgical controllers can be found in US Provisional Patent Application No. 62 / 659,900, entitled METHOD OF HUB COMMUNICATION, filed April 19 2018, which is incorporated by reference in its entirety.
[00264] [00264] Figure 12 is a block diagram of the computer-implemented interactive surgical system, according to at least one aspect of the present disclosure. In one aspect, the computer-implemented interactive surgical system is configured to monitor and analyze data related to the operation of various surgical systems, including central surgical controllers, surgical instruments, robotic devices, and operating rooms or healthcare facilities. The computer-implemented interactive surgical system comprises a cloud-based data analysis system. Although the cloud-based data analysis system is described as a surgical system, it is not necessarily limited as such and could be a cloud-based medical system in general. As illustrated in Figure 12, the cloud-based data analysis system comprises a plurality of surgical instruments 7012 (which may be the same as or similar to instruments 112), a plurality of central surgical controllers 7006 (which may be the same as or similar to central controllers 106), and a surgical data network 7001 (which may be the same as or similar to network 201) to couple the central surgical controllers 7006 to the cloud 7004 (which may be the same as or similar to cloud 204). Each of the plurality of central surgical controllers 7006 is communicatively coupled to one or more surgical instruments 7012. The central controllers 7006 are also communicatively coupled to the cloud 7004 of the computer-implemented interactive surgical system via the network 7001. The cloud 7004 is a remote centralized source of hardware and software for storage
[00265] [00265] In addition, surgical instruments 7012 can comprise transceivers for transmitting data to and from their corresponding central surgical controllers 7006 (which can also comprise transceivers). Combinations of surgical instruments 7012 and corresponding central controllers 7006 can indicate specific locations, such as operating rooms in health posts (for example, hospitals), for providing medical operations. For example, the memory of a central surgical controller 7006 can store location data. As shown in Figure 12, the cloud 7004 comprises central servers 7013 (which can be the same as or similar to remote server 113 in Figure 1 and / or remote server 213 in Figure 9), application servers for central controllers 7002, data analysis modules 7034, and an input / output ("I/O") interface 7007. The central servers 7013 of the cloud 7004 collectively manage the cloud computing system, which includes monitoring requests by client central controllers 7006 and managing the processing capacity of the cloud 7004 to execute the requests. Each of the central servers 7013 comprises one or more processors
[00266] [00266] Based on connections with multiple central surgical controllers 7006 through the network 7001, the cloud 7004 can aggregate the data from the specific data generated by the various surgical instruments 7012 and their corresponding central controllers 7006. Such aggregated data can be stored in the aggregated medical data databases 7011 of the cloud 7004. In particular, the cloud 7004 can advantageously perform data analysis and operations on the aggregated data to produce insights and / or perform functions that individual central controllers 7006 could not achieve on their own. To this end, as shown in Figure 12, the cloud 7004 and the central surgical controllers 7006 are communicatively coupled to transmit and receive information. The I/O interface 7007 is connected to the plurality of central surgical controllers 7006 via the network 7001. In this way, the I/O interface 7007 can be configured to transfer information between the central surgical controllers 7006 and the aggregated medical data databases 7011. Consequently, the I/O interface 7007 can facilitate read / write operations of the cloud-based data analysis system. Such read / write operations can be performed in response to requests from central controllers
[00267] [00267] The configuration of the specific cloud computing system described in the present disclosure is designed specifically to address various issues that arise in the context of medical operations and procedures performed using medical devices, such as surgical instruments 7012, 112. In particular, surgical instruments 7012 can be digital surgical devices configured to interact with the cloud 7004 to implement techniques for improving the performance of surgical operations. Various surgical instruments 7012 and central surgical controllers 7006 can comprise touch-controlled user interfaces, so that physicians can control aspects of the interaction between the surgical instruments 7012 and the cloud 7004. Other user interfaces suitable for control, such as auditorily controlled user interfaces, can also be used.
[00268] [00268] Figure 13 is a block diagram illustrating the functional architecture of the computer-implemented interactive surgical system, according to at least one aspect of the present disclosure. The cloud-based data analysis system includes a plurality of data analysis modules 7034 that can be executed by the processors 7008 of the cloud 7004 to provide data analysis solutions for problems that arise specifically in the medical field. As shown in Figure 13, the functions of the cloud-based data analysis modules 7034 can be assisted through applications for central controllers 7014 hosted by the application servers for central controllers 7002, which can be accessed on the central surgical controllers 7006. The cloud computing processors 7008 and the applications for central controllers 7014 can operate together to execute the data analysis modules 7034. The application program interfaces ("APIs") 7016 define the set of protocols and routines that correspond to the applications for central controllers 7014. Additionally, the APIs 7016 manage the storage and retrieval of data in / from the aggregated medical databases 7011 for the operations of the applications 7014. The cache memories 7018 also store data (for example, temporarily) and are coupled to the APIs 7016 for more efficient retrieval of the data used by the applications
[00269] [00269] For example, the data collection and aggregation module 7022 could be used to generate self-describing data (for example, metadata), including the identification of notable features or configurations (for example, trends), the management of redundant data sets, and the storage of data in paired data sets that can be grouped by surgery, but not necessarily linked to actual surgery dates and surgeons. In particular, paired data sets generated from operations of the surgical instruments 7012 can comprise the application of a binary classification, for example, a bleeding or non-bleeding event. More generally, the binary classification can be characterized as either a desirable event (for example, a successful surgical procedure) or an undesirable event (for example, a failed or misused surgical instrument 7012). The aggregated self-describing data can correspond to individual data received from various groups or subgroups of central surgical controllers 7006. Consequently, the data collection and aggregation module 7022 can manage aggregated metadata or other organized data based on the raw data received from the central surgical controllers
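The pairing described above, grouping binary-classified events by surgery type rather than by date or surgeon, can be sketched as a simple aggregation. This is a hypothetical illustration: the record layout, field names, and counts are invented, not from the patent.

```python
# Sketch of the kind of pairing described for the data collection and
# aggregation module 7022: events carry a binary label (bleeding /
# non-bleeding) and are grouped by procedure type only.
from collections import defaultdict

records = [
    {"procedure": "colectomy", "bleeding": True},
    {"procedure": "colectomy", "bleeding": False},
    {"procedure": "lobectomy", "bleeding": False},
    {"procedure": "colectomy", "bleeding": False},
]

def aggregate(records):
    summary = defaultdict(lambda: {"bleeding": 0, "non_bleeding": 0})
    for r in records:
        key = "bleeding" if r["bleeding"] else "non_bleeding"
        summary[r["procedure"]][key] += 1  # paired by surgery type only
    return dict(summary)

stats = aggregate(records)
```

Note that no surgeon or date field appears in the grouping key, mirroring the de-identified pairing the paragraph describes.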
[00270] [00270] The resource optimization module 7020 can be configured to analyze these aggregated data to determine an optimal use of resources for a specific group or group of health posts. For example, the resource optimization module 7020 can determine an ideal ordering point for surgical stapling instruments 7012 for a group of clinics based on the corresponding expected demand for such instruments
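The "ideal ordering point" mentioned above can be sketched with the classic reorder-point formula from inventory management. This is an assumption about what such a computation might look like, not the patent's method, and all quantities are invented for illustration.

```python
# Sketch of a reorder-point computation such as the resource
# optimization module 7020 might perform for stapler cartridges.

def reorder_point(daily_demand, lead_time_days, safety_stock):
    # Order when remaining stock just covers lead-time demand plus buffer
    return daily_demand * lead_time_days + safety_stock

# A clinic group using ~12 cartridges/day with a 5-day resupply lead
# time and a 20-unit safety buffer would reorder at 80 units on hand.
rop = reorder_point(daily_demand=12, lead_time_days=5, safety_stock=20)
```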
[00271] [00271] The patient results analysis module 7028 can analyze surgical results associated with currently used operating parameters of surgical instruments 7012. The patient results analysis module 7028 can also analyze and evaluate other potential operating parameters. In this context, the recommendations module 7030 could recommend the use of these other potential operating parameters based on obtaining better surgical results, such as better sealing or less bleeding. For example, the recommendations module 7030 could transmit recommendations to a central surgical controller 7006 about when to use a particular cartridge for a corresponding stapling surgical instrument 7012. In this way, the cloud-based data analysis system, while controlling for common variables, can be configured to analyze the large collection of raw data and provide centralized recommendations across multiple health posts (advantageously determined based on aggregated data). For example, the cloud-based data analysis system could analyze, evaluate, and / or aggregate data based on the type of medical practice, type of patient, number of patients, geographical similarity between medical providers, which medical providers / posts use similar types of instruments, etc., in a way that no single health post would be able to analyze independently.
[00272] [00272] The 7026 control program update module can be configured to implement various 7012 surgical instrument recommendations when the corresponding control programs are updated. For example, the Patient Results Analysis Module 7028 could identify correlations that link specific control parameters with successful (or unsuccessful) results. These correlations can be addressed when updated control programs are transmitted to 7012 surgical instruments via the 7026 control program update module. Updates to 7012 instruments that are transmitted via a corresponding central controller 7006 can incorporate aggregated performance data that has been gathered and analyzed by the data collection and aggregation module 7022 of the 7004 cloud. Additionally, the patient results analysis module 7028 and the recommendations module 7030 could identify better methods of using 7012 instruments based on aggregated performance data.
[00273] [00273] The cloud-based data analysis system can include security features implemented by the cloud 7004. These security features can be managed by the authorization and security module 7024. Each central surgical controller 7006 can have unique credentials associated with a username, password, and other appropriate security credentials. These credentials could be stored in memory 7010 and be associated with a permitted level of cloud access. For example, based on providing exact credentials, a central surgical controller 7006 can be granted access to communicate with the cloud to a predetermined extent (for example, it may only participate in transmitting or receiving certain defined types of information). To this end, the aggregated medical data databases 7011 of the cloud 7004 can comprise a database of authorized credentials for verifying the accuracy of the credentials provided. Different credentials can be associated with varying levels of permission to interact with the cloud 7004, such as a predetermined level of access for receiving data analysis generated by the cloud 7004.
[00274] [00274] In addition, for security purposes, the cloud could maintain a database of 7006 central controllers, 7012 instruments and other devices that may comprise a "black list" of devices
[00275] [00275] Surgical instruments 7012 can use wireless transceivers to transmit wireless signals that can represent, for example, authorization credentials for access to the corresponding central controllers 7006 and the cloud 7004. Wired transceivers can also be used to transmit signals. Such authorization credentials can be stored in the respective memory devices of the surgical instruments 7012. The authorization and security module 7024 can determine whether the authorization credentials are accurate or counterfeit. The authorization and security module 7024 can also dynamically generate authorization credentials for increased security. The credentials could also be encrypted, such as by using hash-based encryption. Upon transmitting the appropriate authorization credentials, the surgical instruments 7012 can transmit a signal to the corresponding central controllers 7006 and ultimately to the cloud 7004 to indicate that the instruments 7012 are ready to obtain and transmit medical data. In response, the cloud 7004 can transition to a state enabled for receiving medical data for storage in the aggregated medical data databases 7011. This readiness for data transmission could be indicated, for example, by a light indicator on the instruments 7012. The cloud 7004 can also transmit signals to the surgical instruments 7012 to update their associated control programs. The cloud 7004 can transmit signals that are directed to a particular class of surgical instruments 7012 (for example, electrosurgical instruments), so that software updates for control programs are transmitted only to the appropriate surgical instruments 7012. In addition, the cloud 7004 could be used to implement system-wide solutions to address local or global problems based on selective data transmission and authorization credentials.
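The credential handling described above (hash-based storage, a blacklist, and per-hub access levels) can be sketched as follows. This is a hypothetical illustration, not the patent's implementation: hub identifiers, secrets, access levels, and the salt are invented, and a production system would use a per-credential random salt and a dedicated password-hashing function.

```python
# Sketch of a check the authorization and security module 7024 might
# perform: credentials stored as salted hashes, mapped to access levels,
# with blacklisted devices refused outright.
import hashlib

SALT = b"demo-salt"  # illustrative; real systems use per-credential salts

def credential_hash(secret):
    return hashlib.sha256(SALT + secret.encode()).hexdigest()

AUTHORIZED = {
    "hub-7006-a": (credential_hash("s3cret"), "read_write"),
    "hub-7006-b": (credential_hash("other"), "read_only"),
}
BLACKLIST = {"hub-7006-c"}  # devices forbidden from cloud interaction

def access_level(hub_id, secret):
    if hub_id in BLACKLIST or hub_id not in AUTHORIZED:
        return None
    stored_hash, level = AUTHORIZED[hub_id]
    return level if credential_hash(secret) == stored_hash else None
```

Revoking a defective instrument group, as the next paragraph describes, would amount to moving its identifiers into the blacklist so every subsequent check fails.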
For example, if a group of surgical instruments 7012 is identified as having a common manufacturing defect, the 7004 cloud can change the authorization credentials that correspond to that group to implement an operational lock on the group.
[00276] [00276] The cloud-based data analysis system can allow the monitoring of multiple health posts (for example, medical facilities such as hospitals) to determine improved practices and recommend changes accordingly (through the recommendations module 7030, for example). In this way, the processors 7008 of the cloud 7004 can analyze the data associated with an individual health post to identify the post and aggregate the data with other data associated with other posts in a group. Groups could be defined based on similar operating practices or geographic location, for example. In this way, the cloud 7004 can provide analyses and recommendations for an entire group of health posts. The cloud-based data analysis system could also be used to increase situational awareness. For example, the processors 7008 can predictively model the effects of recommendations on the cost and effectiveness for a specific post (in relation to general operations and / or various medical procedures). The cost and effectiveness associated
[00277] [00277] The data classification and prioritization module 7032 can prioritize and classify data based on criticality (for example, the severity of a medical event associated with the data, unexpectedness, suspiciousness). This classification and prioritization can be used in conjunction with the functions of the other data analysis modules 7034 described above to improve the operation and analysis of the cloud-based data described herein. For example, the data classification and prioritization module 7032 can assign a priority to the data analysis performed by the data collection and aggregation module 7022 and the patient results analysis module 7028. Different levels of prioritization can result in specific responses from the cloud 7004 (corresponding to an urgency level), such as escalation to an accelerated response, special processing, exclusion from the aggregated medical data databases 7011, or other appropriate responses. In addition, if necessary, the cloud 7004 can transmit a request (for example, a push message) through the application servers for central controllers for additional data from the corresponding surgical instruments 7012. The push message can result in a notification displayed on the corresponding central controllers 7006 requesting support or additional data. This push message may be necessary in situations where the cloud detects an irregularity or results outside significant limits and cannot determine the cause of the irregularity. The central servers 7013 can be programmed to activate this push message under certain significant circumstances, such as when the data is determined to be different from an expected value beyond a predetermined threshold or when it appears that security has been compromised, for example.
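The two mechanisms above, severity-based ordering and a threshold test that triggers a push message, can be sketched together. This is a hypothetical illustration: the event names, severity scores, and threshold values are invented, not from the patent.

```python
# Sketch of severity-based prioritization (module 7032) plus the
# "deviation beyond a predetermined threshold" push-message trigger.
import heapq

def prioritize(events):
    # heapq is a min-heap, so negate severity to pop most severe first
    heap = [(-e["severity"], e["name"]) for e in events]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[1] for _ in range(len(heap))]

def needs_push_message(value, expected, threshold):
    # Escalate when the irregularity exceeds the predetermined threshold
    return abs(value - expected) > threshold

order = prioritize([
    {"name": "routine-log", "severity": 1},
    {"name": "misfire", "severity": 8},
    {"name": "odd-reading", "severity": 4},
])
```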
[00278] Additional details regarding the cloud-based data analysis system can be found in US Provisional Patent Application No. 62 / 659,900, entitled METHOD OF HUB COMMUNICATION, filed on April 19, 2018, which is incorporated herein by reference in its entirety.
[00279] Although a "smart" device, including control algorithms that are responsive to detected data, can be an improvement over a "dumb" device that operates without accounting for detected data, some detected data can be incomplete or inconclusive when considered in isolation, that is, without the context of the type of surgical procedure being performed or the type of tissue being operated on. Without knowing the context of the procedure (for example, the type of tissue undergoing surgery, or the type of procedure being performed), the control algorithm may control the modular device incorrectly or suboptimally given the context-free detected data. For example, the ideal way for a control algorithm to control a surgical instrument in response to a particular detected parameter may vary according to the particular type of tissue being operated on. This is due to the fact that different types of tissue have different properties (for example, tear resistance) and thus respond differently to actions performed by surgical instruments. Therefore, it may be desirable for a surgical instrument to perform different actions even when the same measurement is detected for a specific parameter. As a specific example, the ideal way to control a surgical stapling and cutting instrument in response to the instrument detecting an unexpectedly high force to close its end actuator will vary depending on whether the tissue type is susceptible or resistant to tearing. Pan-
[00280] One solution uses a central surgical controller that includes a system configured to derive information about the surgical procedure being performed based on data received from various data sources, and to then control the paired modular devices accordingly. In other words, the central surgical controller is configured to infer information about the surgical procedure from the received data and then control the modular devices paired with the central surgical controller based on the inferred context of the surgical procedure. Figure 14 illustrates a diagram of a situationally aware surgical system 5100, in accordance with at least one aspect of the present disclosure. In some examples, the data sources 5126 include, for example, modular devices 5102 (which may include sensors configured to detect parameters associated with the patient and / or the modular device itself), databases 5122 (for example, an EMR database containing the patient's medical record), and patient monitoring devices 5124 (for example, a blood pressure (BP) monitor and an electrocardiography (ECG) monitor).
[00281] A central surgical controller 5104, which can be similar to the central surgical controller 106 in many ways, can be configured to derive contextual information related to the surgical procedure from the data based, for example, on the specific combination(s) of data received or on the specific order in which the data are received from the data sources 5126. The contextual information inferred from the received data may include, for example, the type of surgical procedure being performed, the specific step of the surgical procedure that the surgeon is performing, the type of tissue being operated on, or the body cavity that is the subject of the procedure. This ability of some aspects of the central surgical controller 5104 to derive or infer information related to the surgical procedure from received data can be called "situational perception." In one example, the central surgical controller 5104 can incorporate a situational perception system, which is the hardware and / or programming associated with the central surgical controller 5104 that derives contextual information related to the surgical procedure based on the received data.
[00282] The situational perception system of the central surgical controller 5104 can be configured to derive contextual information from the data received from the data sources 5126 in several ways. In one example, the situational perception system includes a pattern recognition system, or machine learning system (for example, an artificial neural network), that has been trained on training data to correlate various inputs (for example, data from databases 5122, patient monitoring devices 5124, and / or modular devices 5102) to corresponding contextual information regarding a surgical procedure. In other words, a machine learning system can be trained to accurately derive contextual information regarding a surgical procedure from the inputs provided to it. In another example, the situational perception system may include a lookup table that stores pre-characterized contextual information regarding a surgical procedure in association with one or more entries (or ranges of entries) corresponding to the contextual information. In response to a query with one or more entries, the lookup table can return the corresponding contextual information for the situational perception system to use in controlling the modular devices 5102. In one example, the contextual information received by the situational perception system of the central surgical controller 5104 is associated with a specific control setting or set of control settings for one or more modular devices 5102. In another example, the situational perception system includes an additional machine learning system, lookup table, or other such system, which generates or retrieves one or more control settings for one or more modular devices 5102 when contextual information is provided as input.
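The lookup-table variant described above can be sketched as follows. The input names, value ranges, inferred contexts, and control settings here are invented purely for illustration and are not taken from the disclosure.

```python
# Minimal sketch of a lookup table that maps an input (or input range) to
# pre-characterized contextual information, and a second table that maps the
# inferred context to control settings for a modular device (here, a smoke
# evacuator). All names, ranges, and settings are illustrative assumptions.

CONTEXT_TABLE = [
    # (input name, (low, high), inferred context)
    ("insufflation_pressure_mmHg", (10, 20), "abdominal procedure"),
    ("insufflation_pressure_mmHg", (0, 1), "thoracic procedure"),
]

CONTROL_SETTINGS = {
    "abdominal procedure": {"smoke_evacuator_motor": "high"},
    "thoracic procedure": {"smoke_evacuator_motor": "low"},
}

def infer_context(name, value):
    """Return the pre-characterized context whose input range contains value."""
    for input_name, (lo, hi), context in CONTEXT_TABLE:
        if input_name == name and lo <= value <= hi:
            return context
    return None

context = infer_context("insufflation_pressure_mmHg", 15)
print(context)                    # → abdominal procedure
print(CONTROL_SETTINGS[context])  # → {'smoke_evacuator_motor': 'high'}
```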
[00283] A central surgical controller 5104 that incorporates a situational perception system provides several benefits to the surgical system 5100. One benefit includes improving the interpretation of detected and captured data, which in turn improves the accuracy of the processing and / or use of the data during the course of a surgical procedure. To return to a previous example, a central surgical controller with situational perception 5104 could determine what type of tissue was being operated on; therefore, when an unexpectedly high force to close the end actuator of the surgical instrument is detected, the central surgical controller with situational perception 5104 could correctly accelerate or decelerate the surgical instrument's motor for the type of tissue.
[00284] As another example, the type of tissue being operated on
[00285] As yet another example, the type of body cavity being operated on during an insufflation procedure can affect the function of a smoke evacuator. A central surgical controller with situational perception 5104 can determine whether the surgical site is under pressure (by determining that the surgical procedure is using insufflation) and determine the type of procedure. As a given type of procedure is generally performed in a specific body cavity, the central surgical controller 5104 can then appropriately match the smoke evacuator's motor speed to the body cavity being operated on. In this way, a central surgical controller with situational perception 5104 can provide a consistent amount of smoke evacuation for both thoracic and abdominal procedures.
[00286] As yet another example, the type of procedure being performed can affect the ideal energy level at which an ultrasonic surgical instrument or radio frequency (RF) electrosurgical instrument operates. Arthroscopic procedures, for example, require higher energy levels because the end actuator of the ultrasonic surgical instrument or RF electrosurgical instrument is immersed in fluid. A central surgical controller with situational perception 5104 can determine whether the surgical procedure is an arthroscopic procedure. The central surgical controller 5104 can then adjust the RF power level or the ultrasonic amplitude of the generator (that is, the "energy level") to compensate for the fluid-filled environment. Relatedly, the type of tissue being operated on can affect the ideal energy level at which an ultrasonic surgical instrument or RF electrosurgical instrument operates. A central surgical controller with situational perception 5104 can determine what type of surgical procedure is being performed and then customize the energy level for the ultrasonic surgical instrument or RF electrosurgical instrument, respectively, according to the expected tissue profile for the surgical procedure. In addition, a central surgical controller with situational perception 5104 can be configured to adjust the energy level for the ultrasonic surgical instrument or RF electrosurgical instrument throughout the course of a surgical procedure, rather than just on a procedure-by-procedure basis. A central surgical controller with situational perception 5104 can determine which step of the surgical procedure is being performed or will be performed subsequently, and then update the control algorithms for the generator and / or the ultrasonic surgical instrument or RF electrosurgical instrument to adjust the energy level to a value appropriate for the type of tissue, according to the step of the surgical procedure.
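The context-dependent energy-level selection described above can be sketched as follows. The tissue names, baseline levels, and fluid-compensation offset are illustrative assumptions only; they are not the clinically correct values or the disclosed control algorithm.

```python
# Hedged sketch: a baseline energy level for the expected tissue type is
# raised when the end actuator is immersed in fluid (e.g., during an
# arthroscopic procedure). Numbers are invented for illustration.

TISSUE_BASELINE_LEVEL = {"vessel": 3, "parenchyma": 2}

def energy_level(tissue, fluid_immersed):
    """Return an energy level for the tissue, compensating for fluid immersion."""
    level = TISSUE_BASELINE_LEVEL[tissue]
    if fluid_immersed:
        level += 2  # compensate for energy dissipated into the surrounding fluid
    return level

print(energy_level("vessel", fluid_immersed=False))  # → 3
print(energy_level("vessel", fluid_immersed=True))   # → 5
```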
[00287] As yet another example, data can be drawn from additional data sources 5126 to improve the conclusions that the central surgical controller 5104 draws from any one data source 5126. A central surgical controller with situational perception 5104 can augment the data that it receives from the modular devices 5102 with the contextual information that it has accumulated regarding the surgical procedure from the other data sources 5126. For example, a central surgical controller with situational perception 5104 can be configured to determine whether hemostasis has occurred (that is, whether bleeding has stopped at a surgical site) according to video or image data received from a medical imaging device. However, in some cases, the video or image data may be inconclusive. Therefore, in one example, the central surgical controller 5104 can be additionally configured to compare a physiological measurement (for example, blood pressure detected by a BP monitor communicably connected to the central surgical controller 5104) with the visual or imaging data of hemostasis (for example, from a medical imaging device 124 (Figure 2) communicably coupled to the central surgical controller 5104) to make a determination on the integrity of the staple line or tissue union. In other words, the situational perception system of the central surgical controller 5104 can consider the physiological measurement data to provide additional context in the analysis of the visualization data. The additional context can be useful when the visualization data may be inconclusive or incomplete on its own.
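The idea of augmenting inconclusive visualization data with a physiological measurement can be sketched as follows. The scoring scale, thresholds, and decision rule are hypothetical assumptions for illustration; they are not clinical criteria from the disclosure.

```python
# Illustrative sketch: an image-based bleeding score is used alone when it is
# conclusive; when it is ambiguous, a stable blood-pressure trend serves as
# the tie-breaker. All thresholds are invented for illustration.

def hemostasis_achieved(image_bleed_score, bp_trend_mmHg_per_min):
    """image_bleed_score: 0.0 (no visible bleeding) .. 1.0 (active bleeding)."""
    if image_bleed_score < 0.2:
        return True    # visualization data alone is conclusive
    if image_bleed_score > 0.8:
        return False
    # Inconclusive images: fall back on the physiological measurement.
    return abs(bp_trend_mmHg_per_min) < 1.0

print(hemostasis_achieved(0.1, -3.0))  # → True
print(hemostasis_achieved(0.5, -0.2))  # → True
print(hemostasis_achieved(0.5, -4.0))  # → False
```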
[00288] Another benefit includes proactively and automatically controlling the paired modular devices 5102 according to the specific step of the surgical procedure being performed, to reduce the number of times that medical personnel are required to interact with or control the surgical system 5100 during the course of a surgical procedure. For example, a central surgical controller with situational perception 5104 can proactively activate the generator to which an electrosurgical instrument of
[00289] As another example, a central surgical controller with situational perception 5104 could determine whether the current or subsequent step of the surgical procedure requires a different view or degree of magnification on the screen, according to the feature(s) at the surgical site that the surgeon is expected to view. The central surgical controller 5104 could then proactively change the displayed view (provided, for example, by a medical imaging device for the visualization system 108), so that the screen adjusts automatically throughout the surgical procedure.
[00290] As yet another example, a central surgical controller with situational perception 5104 could determine which step of the surgical procedure is being performed or will be performed subsequently and whether specific data or comparisons between data will be required for that step of the surgical procedure. The central surgical controller 5104 can be configured to call up data screens automatically based on the step of the surgical procedure being performed, without waiting for the surgeon to request the specific information.
[00291] Another benefit includes checking for errors during the setup of the surgical procedure or during the course of the surgical procedure. For example, a central surgical controller with situational perception 5104 could determine whether the operating room is properly or ideally configured for the surgical procedure to be performed. The central surgical controller 5104 can be configured to determine the type of surgical procedure being performed, retrieve the corresponding checklists,
[00292] As another example, the central surgical controller with situational perception 5104 could determine whether the surgeon (or other medical personnel) was making a mistake or otherwise deviating from the expected course of action during the course of a surgical procedure. For example, the central surgical controller 5104 can be configured to determine the type of surgical procedure being performed, retrieve the corresponding list of steps or order of equipment use (for example, from a memory), and then compare the steps being performed, or the equipment being used during the course of the surgical procedure, with the steps or equipment expected for the type of surgical procedure that the central surgical controller 5104 has determined is being performed. In one example, the central surgical controller 5104 can be configured to provide an alert indicating that an unexpected action is being performed or an unexpected device is being used at the specific step in the surgical procedure.
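The error-checking idea above can be sketched as follows: the observed step is compared against the expected step list retrieved for the determined procedure type, and an alert is produced on a mismatch. The procedure name and step names are hypothetical.

```python
# Sketch of step-deviation checking: compare the step actually being performed
# against the expected step list for the determined procedure type, and return
# an alert on an unexpected action. Step names are illustrative assumptions.

EXPECTED_STEPS = {
    "lung segmentectomy": ["retract lung", "dissect", "ligate", "transect"],
}

def check_step(procedure, step_index, observed_step):
    """Return 'ok' if the observed step matches the expected step, else an alert."""
    expected = EXPECTED_STEPS[procedure][step_index]
    if observed_step != expected:
        return f"ALERT: expected '{expected}', observed '{observed_step}'"
    return "ok"

print(check_step("lung segmentectomy", 1, "dissect"))   # → ok
print(check_step("lung segmentectomy", 2, "transect"))
# → ALERT: expected 'ligate', observed 'transect'
```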
[00293] In general, the situational perception system for the central surgical controller 5104 improves the results of surgical procedures by adjusting the surgical instruments (and other modular devices 5102) to the specific context of each surgical procedure (such as adjusting for different types of tissue) and by validating actions during a surgical procedure. The situational perception system also improves the surgeon's efficiency in carrying out surgical procedures by automatically suggesting next steps, providing data, and adjusting the screens and other modular devices 5102 in the operating room according to the specific context of the procedure.
[00294] With reference now to Figure 15, a timeline 5200 is shown representing the situational recognition of a central controller, such as the central surgical controller 106 or 206 (Figures 1 to 11), for example. The timeline 5200 represents an illustrative surgical procedure and the contextual information that the central surgical controller 106, 206 can derive from the data received from the data sources at each step of the surgical procedure. The timeline 5200 represents the typical steps that would be taken by the nurses, surgeons, and other medical personnel during the course of a lung segmentectomy procedure.
[00295] The situational recognition system of the central surgical controller 106, 206 receives data from the data sources throughout the course of the surgical procedure, including data generated each time medical personnel use a modular device that is paired with the central surgical controller 106, 206. The central surgical controller 106, 206 can receive this data from the paired modular devices and other data sources and continually derive inferences (that is, contextual information) about the ongoing procedure as new data is received, such as which step of the procedure is being performed at any given time. The situational recognition system of the central surgical controller 106, 206 is, for example, able to record data related to the procedure to generate reports, verify the steps being taken by the medical personnel, provide data or warnings (for example, through a display screen) that may be relevant to the specific step of the procedure, adjust the modular devices based on the context (for example, activate monitors, adjust the field of view (FOV) of the medical imaging device, or change the energy level of an ultrasonic surgical instrument or RF electrosurgical instrument), and take any other action described above.
[00296] In the first step 5202 of this illustrative procedure, the members of the hospital staff retrieve the patient's electronic medical record (EMR) from the hospital's EMR database. Based on select patient data in the EMR, the central surgical controller 106, 206 determines that the procedure to be performed is a thoracic procedure.
[00297] In the second step 5204, the staff members scan the incoming medical supplies for the procedure. The central surgical controller 106, 206 cross-references the scanned supplies with a list of supplies that are used in various types of procedures and confirms that the mix of supplies corresponds to a thoracic procedure. In addition, the central surgical controller 106, 206 is also able to determine that the procedure is not a wedge procedure (because the incoming supplies lack certain supplies that are necessary for a thoracic wedge procedure, or otherwise do not correspond to a thoracic wedge procedure).
[00298] In the third step 5206, the medical staff scans the patient's wristband with a scanner that is communicably connected to the central surgical controller 106, 206. The central surgical controller 106, 206 can then confirm the patient's identity based on the scanned data.
[00299] In the fourth step 5208, the medical personnel turn on the auxiliary equipment. The auxiliary equipment being used may vary according to the type of surgical procedure and the techniques to be used by the surgeon, but in this illustrative case it includes a smoke evacuator, an insufflator, and a medical imaging device. When activated, the auxiliary equipment that is a modular device can automatically pair with the central surgical controller 106, 206 that is located within a particular vicinity of the modular devices, as part of its initialization process. The central surgical controller 106, 206 can then derive contextual information about the surgical procedure by detecting the types of modular devices that pair with it during this preoperative or initialization phase. In this particular example, the central surgical controller 106, 206 determines that the surgical procedure is a VATS (video-assisted thoracoscopic surgery) procedure
[00300] In the fifth step 5210, the staff members attach the electrocardiogram (ECG) electrodes and other patient monitoring devices to the patient. The ECG electrodes and other patient monitoring devices are able to pair with the central surgical controller 106, 206. As the central surgical controller 106, 206 begins to receive data from the patient monitoring devices, it thus confirms that the patient is in the operating room.
[00301] In the sixth step 5212, the medical personnel induce anesthesia in the patient. The central surgical controller 106, 206 can infer that the patient is under anesthesia based on data from the modular devices and / or the patient monitoring devices, including ECG data, blood pressure data, ventilator data, or combinations thereof, for example. Upon completion of the sixth step 5212, the preoperative portion of the lung segmentectomy procedure is completed and the operative portion begins.
[00302] In the seventh step 5214, the lung of the patient being operated on is collapsed (while ventilation is switched to the contralateral lung). The central surgical controller 106, 206 can infer from the ventilator data that the patient's lung has been collapsed, for example. The central surgical controller 106, 206 can infer that the operative portion of the procedure has started, as it can compare the detection of the patient's lung collapse to the expected steps of the procedure (which can be accessed or retrieved earlier) and thus determine that collapsing the lung is the first operative step in this specific procedure.
[00303] In the eighth step 5216, the medical imaging device (for example, a display device) is inserted and video from the medical imaging device is started. The central surgical controller 106, 206 receives the data from the medical imaging device (that is, video or image data) through its connection to the medical imaging device. Upon receiving the data from the medical imaging device, the central surgical controller 106, 206 can determine that the laparoscopic portion of the surgical procedure has started. In addition, the central surgical controller 106, 206 can determine that the specific procedure being performed is a segmentectomy, rather than a lobectomy (note that a wedge procedure has already been ruled out by the central surgical controller 106, 206 based on the data received in the second step 5204 of the procedure). The data from the medical imaging device 124 (Figure 2) can be used to determine contextual information about the type of procedure being performed in a number of different ways, including by determining the angle at which the medical imaging device is oriented with respect to the visualization of the patient's anatomy, monitoring the number of medical imaging devices being used (that is, which are activated and stopped)
[00304] In the ninth step 5218 of the procedure, the surgical team begins the dissection step. The central surgical controller 106, 206 can infer that the surgeon is in the process of dissecting to mobilize the patient's lung because it receives data from the RF or ultrasonic generator indicating that an energy instrument is being fired. The central surgical controller 106, 206 can cross-reference the received data with the retrieved steps of the surgical procedure
[00305] In the tenth step 5220 of the procedure, the surgical team proceeds to the ligation step. The central surgical controller 106, 206 can infer that the surgeon is ligating the arteries and veins because it receives data from the surgical stapling and cutting instrument indicating that the instrument is being fired. As in the previous step, the central surgical controller 106, 206 can derive this inference by cross-referencing the data received from the surgical stapling and cutting instrument with the retrieved steps of the procedure. In some cases, the surgical instrument can be a surgical tool mounted on a robotic arm of a robotic surgical system.
[00306] In the eleventh step 5222, the segmentectomy portion of the procedure is performed. The central surgical controller 106, 206 can infer that the surgeon is transecting the parenchyma based on data from the surgical stapling and cutting instrument, including data from its cartridge. The cartridge data can correspond to the size or type of staple being fired by the instrument, for example. As different types of staples are used for different types of tissue, the cartridge data can thus indicate the type of tissue being stapled and / or transected. In this case, the type of staple being fired is used for parenchyma (or other similar types of tissue), which allows the central surgical controller 106, 206 to infer that the segmentectomy portion of the procedure is being performed.
[00307] In the twelfth step 5224, the node dissection step is then performed. The central surgical controller 106, 206 can infer that the surgical team is dissecting the node and performing a leak test based on the data received from the generator indicating that an ultrasonic or RF instrument is being fired. For this specific procedure, an RF or ultrasonic instrument being used after the parenchyma has been transected corresponds to the node dissection step, which allows the central surgical controller 106, 206 to make this inference. It should be noted that surgeons regularly switch back and forth between surgical stapling / cutting instruments and surgical energy (that is, RF or ultrasonic) instruments depending on the specific step of the procedure, because different instruments are better adapted to specific tasks. Therefore, the specific sequence in which the stapling / cutting instruments and surgical energy instruments are used can indicate which step of the procedure the surgeon is performing. In addition, in certain cases, robotic tools can be used for one or more steps of a surgical procedure, and / or hand-held surgical instruments can be used for one or more steps of the surgical procedure. The surgeon can alternate between robotic tools and hand-held surgical instruments and / or can use the devices simultaneously, for example. Upon completion of the twelfth step 5224, the incisions are closed and the postoperative portion of the procedure begins.
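The event-to-step inferences made during the operative portion above (steps 5214 through 5224) can be sketched as follows. This is a heavily simplified, hypothetical sketch: the event names and the single linear sequence of rules are assumptions, whereas the actual system cross-references many data sources and contexts.

```python
# Condensed sketch of timeline inference: incoming device events are matched,
# in order, against the expected event sequence retrieved for the determined
# procedure type, and each match yields an inferred procedure step.
# Event and step names are illustrative assumptions.

SEQUENCE = [
    # (expected device event, inferred procedure step)
    ("ventilator_lung_collapse", "lung collapse"),
    ("energy_instrument_fired", "dissection"),
    ("stapler_fired", "ligation"),
    ("stapler_fired", "segmentectomy"),
    ("energy_instrument_fired", "node dissection"),
]

def infer_steps(events):
    """Map a stream of device events onto the expected step sequence."""
    steps, i = [], 0
    for event in events:
        if i < len(SEQUENCE) and event == SEQUENCE[i][0]:
            steps.append(SEQUENCE[i][1])
            i += 1
    return steps

events = ["ventilator_lung_collapse", "energy_instrument_fired",
          "stapler_fired", "stapler_fired", "energy_instrument_fired"]
print(infer_steps(events))
```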
[00308] In the thirteenth step 5226, the patient's anesthesia is reversed. The central surgical controller 106, 206 can infer that the patient is emerging from anesthesia based on ventilator data (that is, the patient's respiratory rate begins to increase), for example.
[00309] Finally, in the fourteenth step 5228, the medical personnel remove the various patient monitoring devices from the patient. The central surgical controller 106, 206 can thus infer that the patient is being transferred to a recovery room when the central controller loses the ECG, blood pressure, and other data from the patient monitoring devices. As can be seen from the description of this illustrative procedure, the central surgical controller 106, 206 can determine or infer when each step of a given surgical procedure is taking place according to the data received from the various data sources that are communicably coupled to the central surgical controller 106, 206.
[00310] Situational recognition is further described in US Provisional Patent Application serial number 62 / 659,900, entitled METHOD OF HUB COMMUNICATION, filed on April 19, 2018, which is hereby incorporated by reference in its entirety. In certain cases, the operation of a robotic surgical system, including the various robotic surgical systems disclosed here, for example, can be controlled by the central controller 106, 206 based on its situational recognition and / or feedback from the components thereof and / or based on information from the cloud 104.
Evaluation of the surgical team
[00311] In some aspects, the computer systems described here are programmed to evaluate the surgical team during the course of a surgical procedure (for example, how they are using the surgical instruments) and to offer suggestions for improving the techniques or actions of the surgical team members. In one aspect, the computer systems described here, such as the central surgical controllers 106, 206 (Figures 1 to 11), can be programmed to analyze the techniques, physical characteristics, and / or performance of a surgeon and / or other surgical team members relative to a baseline. In addition, the computer system can be programmed to provide notifications or warnings indicating when the surgical team is deviating from the baseline, so that the surgical team can alter its actions and optimize its performance or technique. In some aspects, the notifications may include warnings that the surgical team is not using an appropriate technique (which may additionally include recommendations on corrective measures that the surgical team can take to address the issue), suggestions for alternative surgical products, statistics on correlations between procedural variables (for example, the time required to complete the procedure) and the monitored physical characteristics of the surgical team, comparisons between surgeons, and so on. In various aspects, the notifications or recommendations can be provided in real time (for example, in the OR during the surgical procedure) or in a post-procedure report. Consequently, the computer system can be programmed to automatically analyze and compare the techniques and instrument-usage skills of the team members.
[00312] Figure 16 is a diagram of an illustrative operating room (OR) configuration, in accordance with at least one aspect of the present disclosure. In various implementations, the central surgical controller 211801 can be connected to one or multiple cameras 211802, surgical instruments 211810, displays 211806, and other surgical devices within the OR 211800 via a communication protocol (for example, Bluetooth), as described above under the heading "CENTRAL SURGICAL CONTROLLERS". The cameras 211802 can be oriented to capture images and / or video of the surgical team members 211803 during the course of a surgical procedure. Consequently, the central surgical controller 211801 can receive the captured image and / or video data from
[00313] Figure 17 is a logic flow diagram of a process 211000 for visually evaluating surgical team members, in accordance with at least one aspect of the present disclosure. In the following description of the process 211000, reference should also be made to Figures 10 and 16. The process 211000 can be performed by a processor or control circuit of a computer system, such as the processor 244 of the central surgical controller 206 illustrated in Figure 10. Accordingly, the process 211000 can be embodied as a set of computer-executable instructions stored in a memory 249 that, when executed by the processor 244, cause the computer system (for example, a central surgical controller 211801) to perform the described steps.
[00314] As described above under the heading "CENTRAL SURGICAL CONTROLLERS", computer systems such as the central surgical controllers 211801 can be connected or paired with a variety of surgical devices, such as surgical instruments, generators, smoke evacuators, displays, and so on. Through their connections to these surgical devices, the central surgical controllers 211801 can receive an array of perioperative data from these paired surgical devices while the devices are in use during a surgical procedure. Additionally, as described above under the heading SITUATIONAL RECOGNITION, the central surgical controllers 211801 can determine the context of the surgical procedure being performed (for example, the type of procedure or the step of the procedure being performed) based, at least in part, on the perioperative data received from these connected surgical devices. Consequent-
[00315] Consequently, the processor 244 captures 211006 the image(s) of the surgical team that is performing the surgical procedure through, for example, the cameras 211802 positioned inside the OR 211800. The captured image(s) can include still images or moving images (that is, video). The images of the surgical team can be captured at a variety of angles and magnifications, using different filters, and so on. In one implementation, the cameras 211802 are arranged within the OR 211800 so that they can collectively view each member of the surgical team performing the procedure.
[00316] Consequently, the processor 244 determines 211008 a physical characteristic of one or more surgical team members from the captured image(s). For example, the physical characteristic may include posture, as discussed in connection with Figures 18 and 19, or wrist angle, as discussed in connection with Figures 20 and 21. In other implementations, the physical characteristic may include the position, orientation, angle, or rotation of an individual's head, shoulders, torso, elbows, legs, hips, and so on. The physical characteristic can be determined 211008 using a variety of machine vision, image processing, object recognition, and optical tracking techniques. In one aspect, the physical characteristic can be determined 211008 by processing the captured images to detect the edges of objects in the images and comparing the detected objects with a template of the body part being evaluated. Once the body part being evaluated has been recognized, its position, orientation, and other characteristics can be tracked by comparing the movement of the tracked body part in relation to the known camera positions.
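The computation of one such physical characteristic can be sketched as follows, here for a wrist angle derived from tracked 2-D keypoints. The keypoint format (pixel coordinates for elbow, wrist, and knuckle) is an assumption; any optical tracking or pose-estimation stage yielding such coordinates would serve.

```python
# Illustrative sketch: compute the angle at the wrist from three tracked
# keypoints. The keypoints and their coordinates are hypothetical inputs
# that a machine-vision / optical-tracking stage would produce.
import math

def angle_deg(a, b, c):
    """Angle at point b (in degrees) formed by segments b->a and b->c."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))

elbow, wrist, knuckle = (0.0, 0.0), (10.0, 0.0), (20.0, 0.0)
print(round(angle_deg(elbow, wrist, knuckle)))  # → 180 (neutral, straight wrist)

knuckle_flexed = (10.0, 10.0)
print(round(angle_deg(elbow, wrist, knuckle_flexed)))  # → 90 (flexed wrist)
```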
[00317] Accordingly, the processor 244 evaluates 211010 the determined physical characteristic of the surgical team member against a baseline. In one aspect, the baseline can correspond to the surgical context determined through situational awareness. The processor 244 can retrieve the baselines for various physical characteristics from a memory (for example, the memory 249 illustrated in Figure 10) according to the determined surgical context, for example. The baseline can include values or ranges of values for specific physical characteristics to be tracked during specific surgical contexts. The types of physical characteristics evaluated in different surgical contexts can be the same across contexts or unique to each specific surgical context.
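A minimal sketch of the baseline retrieval described above, keyed by surgical context (the table contents, procedure names, and characteristic names here are hypothetical placeholders, not values from the disclosure):

```python
# Hypothetical baseline table: (procedure type, procedure step) ->
# acceptable (min, max) ranges for each tracked physical characteristic.
BASELINES = {
    ("lobectomy", "dissection"): {
        "posture_deviation_cm": (0.0, 5.0),
        "wrist_angle_deg": (70.0, 110.0),
    },
}

def retrieve_baseline(procedure, step, characteristic):
    """Return the (min, max) baseline range for a characteristic in a
    given surgical context, or None if no baseline is stored."""
    ranges = BASELINES.get((procedure, step))
    if ranges is None:
        return None
    return ranges.get(characteristic)
```

A lookup such as `retrieve_baseline("lobectomy", "dissection", "wrist_angle_deg")` would then yield the range against which the measured characteristic is compared.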
[00318] In one aspect, the processor 244 can provide real-time feedback to members of the surgical team during
[00319] In one aspect, one or more of the steps of the process 211000 can be performed by a second or remote computer system, such as the cloud computing systems described under the heading CLOUD-BASED SYSTEM HARDWARE AND FUNCTIONAL MODULES. For example, the central surgical controller 211801 can receive 211002 perioperative data from the connected surgical devices, determine 211004 the surgical context based at least in part on the perioperative data, capture
[00320] Figures 18 and 19 illustrate a prophetic implementation of the process 211000 illustrated in Figure 17, where the physical characteristic being evaluated is the posture of a member of the surgical team. Figure 18 is a diagram illustrating a series of models 211050a, 211050b, 211050c, 211050d of a surgical team member 211052 during the course of a surgical procedure, in accordance with at least one aspect of the present disclosure. Correspondingly, Figure 19 is a graph 211100 depicting the measured posture of the surgical team member illustrated in Figure 18 over time, in accordance with at least one aspect of the present disclosure. Figures 16 and 17 should also be referred to in the description of Figures 18 and 19 below. Accordingly, the central surgical controller 211801 executing the process 211000 can analyze the posture of the surgical team member and provide recommendations if the surgical team member's posture deviates from the baseline. Poor, unexpected, or otherwise inadequate posture may indicate, for example, that the surgeon is tired, is having difficulty with a specific surgical step, is using the surgical instrument incorrectly, or is otherwise acting in a potentially risky manner that can be dangerous. Therefore, monitoring the postures of the members of the surgical team during the course of a surgical procedure, and providing notifications when a team member is deviating from a baseline posture, can be beneficial to alert unaware users to their risky conduct so that they can take corrective action or allow other individuals to take corrective action (for example, swapping out a tired surgical team member for one who is more rested).
[00321] Referring to Figure 19, the vertical axis 211102 of the graph 211100 represents an individual's posture, and the horizontal axis 211104 represents time. The first model 211050a in Figure 18 corresponds to time t1 in Figure 19 during the surgical procedure, the second model 211050b corresponds to time t2, the third model 211050c corresponds to time t3, and the fourth model 211050d corresponds to time t4. Taken together, Figures 18 and 19 illustrate that the posture of the individual being evaluated deviates more and more from the baseline position(s) during the course of the surgical procedure.
[00322] In one aspect, the posture of the individual being evaluated by the computer system can be quantified as a metric corresponding to the deviation in the position of one or more locations on the individual's body relative to corresponding initial or threshold positions. For example, Figure 18 illustrates the change in a head position 211054, a shoulder position 211056, and a hip position 211058 of the modeled subject over time by a first line 211055, a second line 211057, and a third line 211059, respectively. In an aspect utilizing a marker-based optical system, the surgeon's gown can have a marker located at one or more of these locations that can be tracked by the optical system, for example. In an aspect utilizing a markerless optical system, the optical system can be configured to identify the surgical team member and optically track the location and movement of one or more body parts or locations of the identified surgical team member. In addition, the head, shoulder, and hip positions 211054, 211056, 211058 can be compared against a baseline head position 211060, a baseline shoulder position 211062, and a baseline hip position 211064, respectively. The baseline positions 211060, 211062, 211064 can correspond to the initial positions of the respective body parts (that is, the positions at time t0 in Figure 19) or they can be predetermined thresholds against which the positions of the body parts are compared. In one aspect, the posture metric (as represented by the vertical axis 211102 in the graph 211100) can be equal to the distance between one of the body positions 211054, 211056, 211058 and its corresponding baseline position 211060, 211062, 211064. In another aspect, the posture metric can be equal to the cumulative distance between more than one of the body positions 211054, 211056, 211058 and their corresponding baseline positions 211060, 211062, 211064.
The first line 211108 in the graph 211100 represents the raw posture metric values over time, and the second line 211106 represents the normalized posture metric values over time. In various aspects, the process 211000 can evaluate 211010 whether the physical characteristic (in this case, posture) has deviated from the baseline according to raw or mathematically manipulated (e.g., normalized) data.
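The cumulative posture metric and its normalized form described above can be sketched as follows (a minimal illustration; the function names, the 2-D coordinates, and the choice of normalizing by a reference value are assumptions rather than details from the disclosure):

```python
import math

def posture_metric(tracked, baseline):
    """Cumulative Euclidean deviation of tracked body locations
    (e.g. head, shoulders, hips) from their baseline positions."""
    return sum(math.dist(tracked[k], baseline[k]) for k in baseline)

def normalized_metric(raw, reference):
    """Normalize a raw posture metric against a reference value,
    e.g. a maximum expected deviation."""
    return raw / reference
```

For example, a head tracked 5 units away from its baseline while the shoulders remain at baseline yields a cumulative metric of 5, and a normalized value of 0.5 against a reference of 10.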
[00323] In one aspect, the central surgical controller 211801 executing the process 211000 can compare the calculated posture metric against one or more thresholds and then take various actions accordingly. In the depicted implementation, the central surgical controller 211801 compares the posture metric against a first threshold 211110 and a second threshold 211112. If the normalized posture metric, represented by the second line 211106, exceeds the first threshold 211110, then the central surgical controller 211801 can be configured to provide a first notification or alert to the surgical team in the OR 211800 indicating that there is a potential risk with the particular individual's form. Additionally, if the normalized posture metric, represented by the second line 211106, exceeds the second threshold 211112, then the central surgical controller 211801 can be configured to provide a second notification or alert to the users in the OR 211800 indicating that there is a high degree of risk with the particular individual's form. For example, at time t4, the posture metric for the evaluated surgical team member, as represented by the fourth model 211050d, exceeds the first threshold 211110; consequently, the central surgical controller 211801 can be configured to provide a first or initial warning to the surgical team.
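The two-threshold comparison just described could be sketched as below (the function name, return strings, and threshold values are illustrative assumptions; the disclosure specifies only that two escalating notifications are issued):

```python
def posture_alert(metric, first_threshold, second_threshold):
    """Map a normalized posture metric onto the two-level alert scheme:
    None while within limits, an initial warning past the first
    threshold, and a high-risk alert past the second threshold."""
    if metric > second_threshold:
        return "high-risk alert"
    if metric > first_threshold:
        return "initial warning"
    return None
```

With hypothetical thresholds of 2 and 4, a metric of 3 would trigger the initial warning and a metric of 5 the high-risk alert.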
[00324] Figures 20 and 21 illustrate a prophetic implementation of the process 211000 illustrated in Figure 17, where the physical characteristic being evaluated is the wrist angle of a member of the surgical team. Figure 20 is a depiction of a surgeon 211650 holding a surgical instrument 211654, in accordance with at least one aspect of the present disclosure. Correspondingly, Figure 21 is a scatter plot 211700 of wrist angle versus surgical procedure outcomes, in accordance with at least one aspect of the present disclosure. Figures 16 and 17 should also be referred to in the description of Figures 20 and 21 below. Accordingly, the central surgical controller 211801 executing the process 211000 can analyze the wrist angle of a hand of a surgical team member holding a surgical instrument 211654 and provide recommendations if the team member's wrist angle deviates from the baseline. Holding a surgical instrument awkwardly, as evidenced by an extreme angle of the wrist relative to the surgical instrument, may indicate, for example, that the surgeon is using the surgical instrument incorrectly, has positioned the surgical instrument incorrectly, is using an incorrect surgical instrument for the specific step of the procedure, or is otherwise acting in a potentially risky manner that can be dangerous.
[00325] In this specific implementation, the wrist angle of the individual 211650 is defined as the angle a between the longitudinal axis 211656 of the surgical instrument 211654 being wielded
[00326] In one aspect, the central surgical controller 211801 executing the process 211000 can compare the wrist angle against one or more thresholds and then take various actions accordingly. In the implementation shown, the central surgical controller 211801 determines whether the surgeon's wrist angle a falls within a first zone, which is delineated by a first threshold 211708a and a second threshold 211708b; within a second zone, which is delineated by a third threshold 211706a and a fourth threshold 211706b; or outside the second zone. If the wrist angle a measured by the central surgical controller 211801 during the course of a surgical procedure falls within the first and second thresholds 211708a, 211708b, then the central surgical controller 211801 can be configured to determine that the wrist angle a is within acceptable parameters and take no action. If the surgeon's wrist angle a falls between the first and second thresholds 211708a, 211708b and the third and fourth thresholds 211706a, 211706b, then the central surgical controller 211801 can be configured to provide a first notification or alert to the surgical team in the OR 211800 indicating that there is a potential risk with the specific individual's form. Additionally, if the surgeon's wrist angle a is outside the third and fourth thresholds 211706a, 211706b, then the central surgical controller 211801 can be configured to provide a second notification or alert to the users in the OR 211800 indicating that there is a high degree of risk with the specific individual's form.
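The two nested zones for the wrist angle could be classified as follows (a minimal sketch; the default band values and zone labels are hypothetical, since the disclosure does not give numeric thresholds):

```python
def classify_wrist_angle(angle, inner=(80.0, 100.0), outer=(60.0, 120.0)):
    """Classify a wrist angle (degrees) against two nested zones:
    inside the inner band -> acceptable; between the inner and outer
    bands -> potential risk; outside the outer band -> high risk."""
    lo_i, hi_i = inner
    lo_o, hi_o = outer
    if lo_i <= angle <= hi_i:
        return "acceptable"
    if lo_o <= angle <= hi_o:
        return "potential-risk"
    return "high-risk"
```

With these assumed bands, an angle of 90 degrees takes no action, 65 degrees triggers the first notification, and 130 degrees triggers the second.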
[00327] In some aspects, the various thresholds or baselines against which the monitored physical characteristic is compared can be determined empirically. The central surgical controllers 211801 and/or the cloud-based system described above under the heading CLOUD-BASED SYSTEM HARDWARE AND FUNCTIONAL MODULES can capture data related to various physical characteristics of surgical team members from a sample population of surgical procedures for analysis. In one aspect, the computer system can correlate these physical characteristics with various surgical outcomes and then define the thresholds or baselines according to the specific physical characteristics of the surgeon or other surgical team members that are most highly correlated with positive surgical outcomes. Consequently, a central surgical controller 211801 executing the process 211000 can provide notifications or alerts when members of the surgical team are deviating from best practices. In another aspect, the computer system can define the thresholds or baselines according to the physical characteristics that are most often exhibited within the sample population. Consequently, a central surgical controller 211801 executing the process 211000 can provide notifications or alerts when members of the surgical team are deviating from the most common practices. For example, in Figure 21, the first and second thresholds 211708a, 211708b can be set to correspond to the most common wrist angle exhibited by a surgeon when performing the specific surgical procedure (that is, the densest portion of the scatter plot 211700). Consequently, when a central surgical controller 211801 executing the process
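One simple way to derive a threshold band from the densest portion of a sample population, as described above, is an interquartile range over the recorded measurements (a sketch under assumptions: the disclosure does not specify percentiles or an interpolation scheme, and this uses plain linear interpolation):

```python
def empirical_band(samples, lower_pct=25, upper_pct=75):
    """Return a (low, high) threshold band covering the central portion
    of a sample population, via linearly interpolated percentiles."""
    s = sorted(samples)

    def pct(p):
        idx = (len(s) - 1) * p / 100.0
        lo = int(idx)
        hi = min(lo + 1, len(s) - 1)
        frac = idx - lo
        return s[lo] * (1 - frac) + s[hi] * frac

    return pct(lower_pct), pct(upper_pct)
```

For the eleven samples 0 through 10, the default band is (2.5, 7.5), which would play the role of the first and second thresholds 211708a, 211708b.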
[00328] In one aspect, the physical characteristic being tracked by the central surgical controller 211801 can be differentiated according to product type. Consequently, the central surgical controller 211801 can be configured to notify members of the surgical team when the specific physical characteristic being tracked corresponds to a different product type. For example, the central surgical controller 211801 can be configured to notify the surgeon when the surgeon's arm and/or wrist deviates from the baseline for the specific surgical instrument currently being used, thus indicating that a different surgical instrument would be more appropriate.
[00329] In one aspect, the central surgical controller 211801 can be configured to compare the external orientation of a surgical instrument 211810 with the internal orientation of its end effector. The external orientation of the surgical instrument 211810 can be determined using the cameras 211802 and the optical systems described above. The internal orientation of the end effector of the surgical instrument 211810 can be determined using an endoscope or another scoping device used to visualize the surgical site. By comparing the external and internal orientations of the surgical instrument 211810, the central surgical controller 211801 can then determine whether a different type of surgical instrument 211810 would be more suitable. For example, the central surgical controller 211801 can be configured to provide a notification to the surgical team if the external orientation of the surgical instrument 211810 deviates from the internal orientation of the end effector of the surgical instrument 211810 by more than a threshold degree.
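The external-versus-internal orientation check could be sketched as a smallest-angular-difference comparison (the function name and the 30-degree default threshold are illustrative assumptions; the disclosure says only "more than a threshold degree"):

```python
def orientation_mismatch(external_deg, internal_deg, threshold_deg=30.0):
    """True if the instrument's externally observed orientation deviates
    from the end effector's internally observed orientation by more
    than the threshold, using the smallest angular difference."""
    diff = abs(external_deg - internal_deg) % 360.0
    diff = min(diff, 360.0 - diff)  # wrap around 360 degrees
    return diff > threshold_deg
```

For example, external 350 degrees versus internal 10 degrees is only a 20-degree mismatch (no notification), while 0 versus 45 degrees exceeds the assumed threshold.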
[00330] In short, computer systems, such as a central surgical controller 211801, can be configured to provide recommendations to a member of the surgical team (for example, a surgeon) as the surgical team member's technique begins to deviate from the best practice or the most common practice. In some aspects, the computer system can be configured to provide notifications or feedback only when the individual has repeatedly exhibited the suboptimal behavior during the course of a surgical procedure. The notifications provided by the computer systems may suggest, for example, that the surgical team member adjust their technique to match the ideal technique for the type of procedure, use a more suitable instrument, and so on.
[00331] In one aspect, the computer system (for example, a central surgical controller 211801) can be configured to allow members of the surgical team to compare their technique against themselves, instead of comparing against baselines established by a sample population or those pre-programmed into the computer system. In other words, the baseline against which the computer system compares the surgical team member may be the surgical team member's previous performance in a specific type of surgical procedure or a previous instance of using a specific type of surgical instrument. Such aspects can be useful for enabling surgeons to track improvements in their surgical techniques or for documenting training periods for new surgical products. Consequently, the central surgical controller 211801 can be configured to evaluate products during a trial period and provide highlights of the use of the products during the given period.
[00332] In one aspect, the computer system (for example, a central surgical controller 211801) can be configured to allow members of the surgical team to compare their technique directly against that of other surgeons, instead of comparing against baselines established by the sample population or pre-programmed into the computer system.
[00333] In one aspect, the computer system (for example, a central surgical controller 211801) can be configured to analyze trends in surgical device usage as surgeons become more experienced in performing specific surgical procedures (or in performing surgical procedures in general) or in using new surgical instruments. For example, the computer system can identify movements, behaviors, and other physical characteristics that change dramatically as surgeons become more experienced. Consequently, the computer system can recognize when a surgeon is exhibiting suboptimal techniques early in the surgeon's learning curve and can provide recommendations on the best approach before the suboptimal technique becomes too ingrained in the surgeon.
EXAMPLES
[00334] Various aspects of the subject matter described herein are set out in the following numbered examples:
[00335] Example 1. A computer system configured to be communicably coupled to a surgical device and a camera. The computer system comprises a processor and a memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the computer system to: receive perioperative data from the surgical device; determine a surgical context based, at least in part, on the perioperative data; receive an image of an individual via the camera; determine a physical characteristic of the individual from the image; retrieve a baseline physical characteristic corresponding to the surgical context; and determine whether the individual's physical characteristic deviates from the baseline physical characteristic.
[00336] Example 2. The computer system of Example 1, wherein the physical characteristic comprises the individual's posture.
[00337] Example 3. The computer system of Example 2, wherein the individual's posture corresponds to a deviation between at least one body part position and a reference position.
[00338] Example 4. The computer system of Example 1, wherein the physical characteristic comprises an orientation of the individual's wrist.
[00339] Example 5. The computer system of Example 4, wherein the orientation of the individual's wrist corresponds to an angle between the individual's wrist and a surgical instrument wielded by the individual.
[00340] Example 6. The computer system of any one of Examples 1 to 5, wherein the baseline physical characteristic comprises a previously recorded instance of the physical characteristic for the individual.
[00341] Example 7. The computer system of any one of Examples 1 to 6, wherein the memory additionally stores instructions that, when executed by the processor, cause the computer system to provide a notification according to the deviation between the physical characteristic and the baseline physical characteristic.
[00342] Example 8. The computer system of Example 7, wherein the computer system provides the notification during a surgical procedure in which the perioperative data is received.
[00343] Example 9. A computer-implemented method for tracking a physical characteristic of an individual. The method comprises: receiving, by a computer system, perioperative data from a surgical device; determining, by the computer system, a surgical context based, at least in part, on the perioperative data; receiving, by the computer system, an image of the individual via a camera communicably coupled to the computer system; determining, by the computer system, a physical characteristic of the individual from the image; retrieving, by the computer system, a baseline physical characteristic corresponding to the surgical context; and determining, by the computer system, whether the individual's physical characteristic deviates from the baseline physical characteristic.
[00344] Example 10. The computer-implemented method of Example 9, wherein the physical characteristic comprises the individual's posture.
[00345] Example 11. The computer-implemented method of Example 10, wherein the individual's posture corresponds to a deviation between at least one body part position and a reference position.
[00346] Example 12. The computer-implemented method of Example 9, wherein the physical characteristic comprises an orientation of the individual's wrist.
[00347] Example 13. The computer-implemented method of Example 12, wherein the orientation of the individual's wrist corresponds to an angle between the individual's wrist and a surgical instrument wielded by the individual.
[00348] Example 14. The computer-implemented method of any one of Examples 9 to 13, wherein the baseline physical characteristic comprises a previously recorded instance of the physical characteristic for the individual.
[00349] Example 15. The computer-implemented method of any one of Examples 9 to 14, further comprising providing, by the computer system, a notification on a display according to the deviation between the physical characteristic and the baseline physical characteristic.
[00350] Example 16. A computer system configured to be communicably coupled to a surgical device and a camera. The computer system comprises a processor and a memory coupled to the processor. The memory stores instructions that, when executed by the processor, cause the computer system to: receive perioperative data from the device.
[00351] Example 17. The computer system of Example 16, wherein the remote computer system comprises a cloud-based computer system.
[00352] Example 18. The computer system of Example 16 or 17, wherein the physical characteristic comprises the individual's posture.
[00353] Example 19. The computer system of Example 18, wherein the individual's posture corresponds to a deviation between at least one body part position and a reference position.
[00354] Example 20. The computer system of Example 16 or 17, wherein the physical characteristic comprises an orientation of the individual's wrist.
[00355] Example 21. The computer system of Example 20, wherein the orientation of the individual's wrist corresponds to an angle between the individual's wrist and a surgical instrument wielded by the individual.
[00356] Although several forms have been illustrated and described, it is not the Applicant's intention to restrict or limit the scope of the appended claims.
[00357] The foregoing detailed description has set forth various forms of the devices and/or processes via the use of block diagrams, flowcharts, and/or examples. Insofar as such block diagrams, flowcharts, and/or examples contain one or more functions and/or operations, it will be understood by those skilled in the art that each function and/or operation within such block diagrams, flowcharts, and/or examples can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof. Those skilled in the art will recognize, however, that some aspects of the aspects disclosed herein, in whole or in part, can be equivalently implemented in integrated circuits, as one or more computer programs running on one or more computers (for example, as one or more programs running on one or more computer systems), as one or more programs running on one or more processors (for example, as one or more programs running on one or more microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and firmware would be well within the skill of one skilled in the art in light of this disclosure. In addition, those skilled in the art will appreciate that the mechanisms of the subject matter described herein are capable of being distributed as one or more program products in a variety of forms and that an illustrative form of the subject matter described herein applies regardless of the particular type of signal-bearing medium used to actually carry out the distribution.
[00358] The instructions used to program logic to perform various disclosed aspects can be stored within a memory in the system, such as dynamic random access memory (DRAM), cache, flash memory, or other storage. Furthermore, the instructions can be distributed via a network or by way of other computer-readable media. Thus, machine-readable media may include any mechanism for storing or transmitting information in a form readable by a machine (for example, a computer), including, but not limited to, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, read-only memory (ROM), random access memory (RAM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), magnetic or optical cards, flash memory, or a tangible, machine-readable storage medium used in the transmission of information over the Internet via electrical, optical, acoustical, or other forms of propagated signals (for example, carrier waves, infrared signals, digital signals, etc.). Accordingly, non-transitory computer-readable media includes any type of machine-readable media suitable for storing or transmitting electronic instructions or information in a form readable by a machine (for example, a computer).
[00359] As used in any aspect herein, the term "control circuit" may refer to, for example, hardwired circuitry, programmable circuitry (for example, a computer processor including one or more individual instruction processing cores, processing unit, processor, microcontroller, microcontroller unit, controller, digital signal processor (DSP), programmable logic device (PLD), programmable logic array (PLA), or field programmable gate array (FPGA)), state machine circuitry, firmware that stores instructions executed by the programmable circuitry, and any combination thereof. The control circuit may, collectively or individually, be embodied as circuitry that forms part of a larger system, for example, an integrated circuit (IC), an application-specific integrated circuit (ASIC), a system on chip (SoC), desktop computers, laptop computers, tablet computers, servers, smart phones, etc. Accordingly, as used herein, "control circuit" includes, but is not limited to, electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application-specific integrated circuit, electrical circuitry forming a general-purpose computing device configured by a computer program (for example, a general-purpose computer configured by a computer program which at least partially carries out processes and/or devices described herein, or a microprocessor configured by a computer program which at least partially carries out processes and/or devices described herein), electrical circuitry forming a memory device (for example, forms of random access memory), and/or electrical circuitry forming a communications device (for example, a modem, communications switch, or optical-electrical equipment).
Those skilled in the art will recognize that the subject matter described herein may be implemented in an analog or digital fashion, or in some combination thereof.
[00360] As used in any aspect herein, the term "logic" may refer to an application, software, firmware, and/or circuitry configured to perform any of the aforementioned operations. The software may be embodied as a software package, code, instructions, instruction sets, and/or data recorded on non-transitory computer-readable storage media. The firmware may be embodied as code, instructions, instruction sets, and/or data that are hard-coded (for example, non-volatile) in memory devices.
[00361] As used in any aspect herein, the terms "component", "system", "module", and the like can refer to a computer-related entity, either hardware, a combination of hardware and software, software, or software in execution.
[00362] As used in any aspect herein, an "algorithm" refers to a self-consistent sequence of steps leading to a desired result, where a "step" refers to a manipulation of physical quantities and/or logic states which may, though need not necessarily, take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It is common usage to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like. These and similar terms may be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities and/or states.
[00363] A network may include a packet-switched network. The communication devices may be capable of communicating with each other using a selected packet-switched network communications protocol.
[00364] Unless specifically stated otherwise, as apparent from the foregoing disclosure, it is appreciated that, throughout the foregoing disclosure, discussions using terms such as "processing", "computing", "calculating", "determining", "displaying", or the like refer to the action and processes of a computer, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer's registers and memories into other data similarly represented as physical quantities within the computer's memories or registers, or within other such information storage, transmission, or display devices.
[00365] One or more components herein may be referred to as "configured to", "configurable to", "operable/operative to", "adapted/adaptable to", "able to", "conformable/conformed to", etc. Those skilled in the art will recognize that "configured to" can generally encompass active-state components, inactive-state components, and/or standby-state components, unless context requires otherwise.
[00366] The terms "proximal" and "distal" are used herein with reference to a clinician manipulating the handle portion of a surgical instrument. The term "proximal" refers to the portion closest to the clinician, and the term "distal" refers to the portion located away from the clinician. It will further be appreciated that, for convenience and clarity, spatial terms such as "vertical", "horizontal", "up", and "down" may be used herein with respect to the drawings. However, surgical instruments are used in many orientations and positions, and these terms are not intended to be limiting and/or absolute.
[00367] Those skilled in the art will recognize that, in general, terms used herein, and especially in the appended claims (for example, bodies of the appended claims), are generally intended as "open" terms (for example, the term "including" should be interpreted as "including, but not limited to"; the term "having" should be interpreted as "having at least"; the term "includes" should be interpreted as "includes, but is not limited to"; etc.). It will be further understood by those skilled in the art that, if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation, no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases "at least one" and "one or more" to introduce claim recitations. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles "a" or "an" limits any particular claim containing such introduced claim recitation to claims containing only one such recitation, even when the same claim includes the introductory phrases "one or more" or "at least one" and indefinite articles such as "a" or "an" (for example, "a" and/or "an" should typically be interpreted to mean "at least one" or "one or more"); the same holds true for the use of definite articles used to introduce claim recitations.
[00368] In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should typically be interpreted to mean at least the recited number (e.g., the bare recitation of "two recitations," without other modifiers, typically means at least two recitations, or two or more recitations). Furthermore, in those instances where a convention analogous to "at least one of A, B, and C, etc." is used, in general such a construction is intended in the sense in which one having skill in the art would understand the convention (e.g., "a system having at least one of A, B, and C" would include, but not be limited to, systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). In those instances where a convention analogous to "at least one of A, B, or C, etc." is used, such a construction is generally intended in the sense in which one having skill in the art would understand the convention (e.g., "a system having at least one of A, B, or C" would include, but not be limited to, systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). It will be further understood by those skilled in the art that typically a disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms, unless context dictates otherwise. For example, the phrase "A or B" will typically be understood to include the possibilities of "A" or "B" or "A and B".
[00369] With respect to the appended claims, those skilled in the art will appreciate that the operations recited therein may generally be performed in any order. Also, although various operational flow diagrams are presented in one or more sequences, it should be understood that the various operations may be performed in orders other than those shown, or may be performed concurrently. Examples of such alternate orderings may include overlapping, interleaved, interrupted, reordered, incremental, preparatory, supplemental, simultaneous, reverse, or other variant orderings, unless context dictates otherwise.
[00370] It is worth noting that any reference to "one aspect," "an aspect," "an exemplification," or "one exemplification," and the like, means that a particular feature, structure, or characteristic described in connection with the aspect is included in at least one aspect. Thus, appearances of the phrases "in one aspect," "in an aspect," "in an exemplification," and "in one exemplification" in various places throughout this specification are not necessarily all referring to the same aspect. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more aspects.
[00371] Any patent application, patent, non-patent publication, or other disclosure material referred to in this specification and/or listed in any application data sheet is incorporated by reference herein, to the extent that the incorporated materials are not inconsistent herewith. As such, and to the extent necessary, the disclosure as explicitly set forth herein supersedes any conflicting material incorporated herein by reference. Any material, or portion thereof, that is said to be incorporated by reference herein, but which conflicts with definitions, statements, or other disclosure material set forth herein, will only be incorporated to the extent that no conflict arises between the incorporated material and the existing disclosure material.
[00372] In summary, numerous benefits have been described that result from employing the concepts described herein. The foregoing description of the one or more embodiments has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Modifications or variations are possible in light of the above teachings. The one or more embodiments were chosen and described in order to illustrate principles and practical application, thereby enabling one of ordinary skill in the art to utilize the various embodiments, and with various modifications, as are suited to the particular use contemplated. It is intended that the claims submitted herewith define the overall scope.
Claims (21)
[1]
1. Computer system configured to be communicably coupled to a surgical device and a camera, wherein the computer system is characterized by comprising: a processor; and a memory coupled to the processor, wherein the memory stores instructions that, when executed by the processor, cause the computer system to: receive perioperative data from the surgical device; determine a surgical context based, at least in part, on the perioperative data; receive an image of an individual through the camera; determine a physical characteristic of the individual from the image; retrieve a baseline physical characteristic corresponding to the surgical context; and determine whether the individual's physical characteristic deviates from the baseline physical characteristic.
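Outside the claim language, the sequence of operations recited in claim 1 can be sketched as a simple pipeline. Everything below is an illustrative assumption: the function names, the rule used to infer the context, the baseline values, and the tolerance are not part of the disclosure, which does not specify an implementation.

```python
# Hypothetical sketch of claim 1's pipeline: perioperative data -> surgical
# context -> image-derived physical characteristic -> comparison against a
# context-specific baseline. All names and values are illustrative only.

def determine_context(perioperative_data):
    # Toy rule: infer the procedure step from the device's reported state.
    if perioperative_data.get("stapler_fired"):
        return "transection"
    return "dissection"

def measure_characteristic(image):
    # Stand-in for a computer-vision estimate (e.g., a posture angle in degrees).
    return image["estimated_posture_angle"]

BASELINES = {"dissection": 10.0, "transection": 15.0}  # hypothetical values

def assess(perioperative_data, image, tolerance=5.0):
    context = determine_context(perioperative_data)
    characteristic = measure_characteristic(image)
    deviation = characteristic - BASELINES[context]
    return {"context": context, "deviation": deviation,
            "out_of_range": abs(deviation) > tolerance}

result = assess({"stapler_fired": True}, {"estimated_posture_angle": 22.0})
print(result)  # {'context': 'transection', 'deviation': 7.0, 'out_of_range': True}
```

The same structure supports any measurable characteristic; only `measure_characteristic` and the baseline table would change.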
[2]
2. Computer system, according to claim 1, characterized in that the physical characteristic comprises a posture of the individual.
[3]
3. Computer system, according to claim 2, characterized in that the individual's posture corresponds to a deviation between at least one body part position and a reference position.
[4]
4. Computer system, according to claim 1, characterized in that the physical characteristic comprises an orientation of the individual's wrist.
[5]
5. Computer system, according to claim 4, characterized in that the orientation of the individual's wrist corresponds to an angle between the individual's wrist and a surgical instrument held by the individual.
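The wrist/instrument angle of claims 4 and 5 reduces to the angle between two vectors. As a hedged illustration, assuming a pose-estimation step has already produced a forearm-to-wrist vector and an instrument-shaft vector (neither of which the claims specify how to obtain), the angle follows from the dot product:

```python
import math

# Hypothetical computation of the claimed wrist orientation as the angle
# between a forearm->wrist vector and an instrument-shaft vector, both
# assumed to come from pose estimation on the camera image.

def angle_between(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return math.degrees(math.acos(dot / (norm_u * norm_v)))

forearm_to_wrist = (1.0, 0.0, 0.0)
instrument_shaft = (1.0, 1.0, 0.0)
print(round(angle_between(forearm_to_wrist, instrument_shaft), 1))  # 45.0
```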
[6]
6. Computer system, according to claim 1, characterized in that the baseline physical characteristic comprises a previously recorded instance of the physical characteristic for the individual.
[7]
7. Computer system, according to claim 1, characterized in that the memory additionally stores instructions that, when executed by the processor, cause the computer system to provide a notification according to the deviation between the physical characteristic and the baseline physical characteristic.
[8]
8. Computer system, according to claim 7, characterized in that the computer system provides the notification during a surgical procedure in which the perioperative data is received.
[9]
9. Computer-implemented method for tracking a physical characteristic of an individual, wherein the method is characterized by comprising: receiving, by a computer system, perioperative data from a surgical device; determining, by the computer system, a surgical context based, at least in part, on the perioperative data; receiving, by the computer system, an image of the individual through a camera communicably coupled to the computer system; determining, by the computer system, a physical characteristic of the individual from the image; retrieving, by the computer system, a baseline physical characteristic corresponding to the surgical context; and determining, by the computer system, whether the individual's physical characteristic deviates from the baseline physical characteristic.
[10]
10. Computer-implemented method, according to claim 9, characterized in that the physical characteristic comprises a posture of the individual.
[11]
11. Computer-implemented method, according to claim 10, characterized in that the individual's posture corresponds to a deviation between at least one body part position and a reference position.
[12]
12. Computer-implemented method, according to claim 9, characterized in that the physical characteristic comprises an orientation of the individual's wrist.
[13]
13. Computer-implemented method, according to claim 12, characterized in that the orientation of the individual's wrist corresponds to an angle between the individual's wrist and a surgical instrument wielded by the individual.
[14]
14. Computer-implemented method, according to claim 9, characterized in that the baseline physical characteristic comprises a previously recorded instance of the physical characteristic for the individual.
[15]
15. Computer-implemented method, according to claim 9, characterized by additionally comprising providing, by the computer system, a notification on a display according to the deviation between the physical characteristic and the baseline physical characteristic.
[16]
16. Computer system configured to be communicably coupled to a surgical device and a camera, wherein the computer system is characterized by comprising: a processor; and a memory coupled to the processor, wherein the memory stores instructions that, when executed by the processor, cause the computer system to: receive perioperative data from the surgical device; determine a surgical context based, at least in part, on the perioperative data; receive an image of an individual through the camera; determine a physical characteristic of the individual from the image; transmit identification data of the physical characteristic and the surgical context to a remote computer system, wherein the remote computer system determines a baseline physical characteristic corresponding to the surgical context and the physical characteristic according to data aggregated from a plurality of computer systems connected to the remote computer system; and receive, from the remote computer system, the deviation between the individual's physical characteristic and the baseline physical characteristic.
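Claim 16 splits the work of claim 1: the local system measures and transmits, while a remote system (a cloud system, per claim 17) derives the baseline from submissions aggregated across many connected systems. A minimal sketch of that split, with the class name, running-mean baseline, and return convention all being assumptions for illustration rather than anything the claims prescribe:

```python
# Illustrative sketch of claim 16's division of labor: local systems report
# (context, characteristic) pairs; the remote service aggregates them into a
# per-context baseline (here, a running mean) and returns each submission's
# deviation from that baseline. Names and the aggregation rule are assumed.

from collections import defaultdict

class RemoteBaselineService:
    def __init__(self):
        self._sums = defaultdict(float)
        self._counts = defaultdict(int)

    def report(self, context, characteristic):
        # Fold the new submission into the aggregate for this context.
        self._sums[context] += characteristic
        self._counts[context] += 1
        baseline = self._sums[context] / self._counts[context]
        # Deviation of this submission from the aggregate baseline.
        return characteristic - baseline

service = RemoteBaselineService()
service.report("dissection", 10.0)   # first report defines the baseline
service.report("dissection", 14.0)
deviation = service.report("dissection", 18.0)
print(deviation)  # 4.0
```

A production system would of course use a richer statistic than a running mean, but the data flow (report, aggregate, return deviation) is the same.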
[17]
17. Computer system according to claim 16, characterized in that the remote computer system comprises a cloud computing system.
[18]
18. Computer system, according to claim 16, characterized in that the physical characteristic comprises a posture of the individual.
[19]
19. Computer system, according to claim 18, characterized in that the individual's posture corresponds to a deviation between at least one body part position and a reference position.
[20]
20. Computer system, according to claim 16, characterized in that the physical characteristic comprises an orientation of the individual's wrist.
[21]
21. Computer system, according to claim 20, characterized in that the orientation of the individual's wrist corresponds to an angle between the individual's wrist and a surgical instrument held by the individual.
[Drawing-text residue removed. Legible figure labels: MONITOR 135; MODULE 106; IMAGING MODULE 158; VISUALIZATION SYSTEM 108; GENERATOR MODULE 140; ROBOTIC SYSTEM 110; SMOKE EVACUATION MODULE 126; SUCTION/IRRIGATION MODULE 128; INSTRUMENT MODULES 136; PROCESSOR MODULE 132; STORAGE ARRAY 134; OPERATING ROOM MAPPING MODULE 133.]
Similar technologies:
Publication number | Publication date | Patent title
BR112020013047A2|2020-12-01|utilization and technical analysis of the surgeon's / team's performance against a baseline to optimize the use and performance of the device, for both current and future procedures
BR112020013177A2|2020-12-01|surgical network recommendations based on real-time analysis of procedure variables in relation to a baseline highlighting differences in relation to the optimal solution
BR112020013175A2|2020-12-01|imaging of areas outside the abdomen to improve placement and control of an in-use surgical device
BR112020013196A2|2020-12-01|position detection and patient contact with the use of the monopolar return block electrode to provide situational recognition to the central controller
BR112020013098A2|2020-11-24|determination of prioritization of communication, interaction or processing of surgical network based on system or device needs
BR112020013013A2|2020-11-24|surgical systems with autonomously adjustable control programs
US20190200980A1|2019-07-04|Surgical system for presenting information interpreted from external data
US10943454B2|2021-03-09|Detection and escalation of security responses of surgical instruments to increasing severity threats
BR112020013169A2|2020-12-01|surgical tool equipped with motor with predefined adjustable control algorithm to control end actuator parameters
US20210212694A1|2021-07-15|Method for facility data collection and interpretation
BR112020013079A2|2020-12-01|wirelessly pairing a surgical device with another device within a sterile surgical field based on the use and situational recognition of devices
BR112020013241A2|2020-12-01|control of a surgical system through a surgical barrier
BR112020013112A2|2020-11-24|comprehensive real-time analysis of all instrumentation used in surgery with the use of fluid data to track instruments through storage and internal processes
BR112020013228A2|2020-12-01|data communication in which a surgical network uses context of the data and requirements of a receiver / user system to influence the inclusion or link of data and metadata to establish continuity
US20210192914A1|2021-06-24|Surgical hub and modular device response adjustment based on situational awareness
BR112020013162A2|2020-12-01|interactive surgical system
US11278281B2|2022-03-22|Interactive surgical system
BR112020013087A2|2020-12-01|detection and escalation of surgical instrument safety responses to threats of increasing severity
BR112020013031A2|2020-11-24|response adjustment of modular device and central surgical controller based on situational recognition
BR112020012957A2|2020-12-01|surgical system to present information interpreted from external data
BR112020013176A2|2020-12-01|adjustments based on the properties of airborne particles
BR112020013024A2|2020-11-24|adjustment of device control programs based on stratified contextual data in addition to the data
BR112020013021A2|2020-11-24|adjustment of a function of the surgical device based on situational recognition
BR112020013229A2|2020-12-01|surgical network, instrument and cloud responses based on validation of received data set and authentication of its source and integrity
Family patents:
Publication number | Publication date
JP2021509333A|2021-03-25|
EP3506302A1|2019-07-03|
CN111788637A|2020-10-16|
WO2019133128A1|2019-07-04|
US20190201126A1|2019-07-04|
US20210212774A1|2021-07-15|
Cited references:
Publication number | Application date | Publication date | Applicant | Patent title

AUPQ600100A0|2000-03-03|2000-03-23|Macropace Products Pty. Ltd.|Animation technology|
US7995045B2|2007-04-13|2011-08-09|Ethicon Endo-Surgery, Inc.|Combined SBI and conventional image processor|
US7982776B2|2007-07-13|2011-07-19|Ethicon Endo-Surgery, Inc.|SBI motion artifact removal apparatus and method|
EP2391259A1|2009-01-30|2011-12-07|The Trustees Of Columbia University In The City Of New York|Controllable magnetic source to fixture intracorporeal apparatus|
JP5719819B2|2012-09-28|2015-05-20|日本光電工業株式会社|Surgery support system|
KR101451970B1|2013-02-19|2014-10-23|주식회사 루트로닉|An ophthalmic surgical apparatus and an method for controlling that|
US10098527B2|2013-02-27|2018-10-16|Ethidcon Endo-Surgery, Inc.|System for performing a minimally invasive surgical procedure|
JP6657933B2|2015-12-25|2020-03-04|ソニー株式会社|Medical imaging device and surgical navigation system|
JPWO2017169823A1|2016-03-30|2019-02-07|ソニー株式会社|Image processing apparatus and method, surgical system, and surgical member|US11103268B2|2017-10-30|2021-08-31|Cilag Gmbh International|Surgical clip applier comprising adaptive firing control|
US11141160B2|2017-10-30|2021-10-12|Cilag Gmbh International|Clip applier comprising a motor controller|
US11229436B2|2017-10-30|2022-01-25|Cilag Gmbh International|Surgical system comprising a surgical tool and a surgical hub|
US11253315B2|2017-12-28|2022-02-22|Cilag Gmbh International|Increasing radio frequency to create pad-less monopolar loop|
US11109866B2|2017-12-28|2021-09-07|Cilag Gmbh International|Method for circular stapler control algorithm adjustment based on situational awareness|
US20190201146A1|2017-12-28|2019-07-04|Ethicon Llc|Safety systems for smart powered surgical stapling|
US11051876B2|2017-12-28|2021-07-06|Cilag Gmbh International|Surgical evacuation flow paths|
US20190274716A1|2017-12-28|2019-09-12|Ethicon Llc|Determining the state of an ultrasonic end effector|
US11166772B2|2017-12-28|2021-11-09|Cilag Gmbh International|Surgical hub coordination of control and communication of operating room devices|
US11234756B2|2017-12-28|2022-02-01|Cilag Gmbh International|Powered surgical tool with predefined adjustable control algorithm for controlling end effector parameter|
US10849697B2|2017-12-28|2020-12-01|Ethicon Llc|Cloud interface for coupled surgical devices|
US10966791B2|2017-12-28|2021-04-06|Ethicon Llc|Cloud-based medical analytics for medical facility segmented individualization of instrument function|
US11069012B2|2017-12-28|2021-07-20|Cilag Gmbh International|Interactive surgical systems with condition handling of devices and data capabilities|
US11045591B2|2017-12-28|2021-06-29|Cilag Gmbh International|Dual in-series large and small droplet filters|
US11266468B2|2017-12-28|2022-03-08|Cilag Gmbh International|Cooperative utilization of data derived from secondary sources by intelligent surgical hubs|
US10892995B2|2017-12-28|2021-01-12|Ethicon Llc|Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs|
US20190206551A1|2017-12-28|2019-07-04|Ethicon Llc|Spatial awareness of surgical hubs in operating rooms|
US10987178B2|2017-12-28|2021-04-27|Ethicon Llc|Surgical hub control arrangements|
US20190201087A1|2017-12-28|2019-07-04|Ethicon Llc|Smoke evacuation system including a segmented control circuit for interactive surgical platform|
US11213359B2|2017-12-28|2022-01-04|Cilag Gmbh International|Controllers for robot-assisted surgical platforms|
US11096693B2|2017-12-28|2021-08-24|Cilag Gmbh International|Adjustment of staple height of at least one row of staples based on the sensed tissue thickness or force in closing|
US10695081B2|2017-12-28|2020-06-30|Ethicon Llc|Controlling a surgical instrument according to sensed closure parameters|
US10944728B2|2017-12-28|2021-03-09|Ethicon Llc|Interactive surgical systems with encrypted communication capabilities|
US11202570B2|2017-12-28|2021-12-21|Cilag Gmbh International|Communication hub and storage device for storing parameters and status of a surgical device to be shared with cloud based analytics systems|
US11076921B2|2017-12-28|2021-08-03|Cilag Gmbh International|Adaptive control program updates for surgical hubs|
US10758310B2|2017-12-28|2020-09-01|Ethicon Llc|Wireless pairing of a surgical device with another device within a sterile surgical field based on the usage and situational awareness of devices|
US11257589B2|2017-12-28|2022-02-22|Cilag Gmbh International|Real-time analysis of comprehensive cost of all instrumentation used in surgery utilizing data fluidity to track instruments through stocking and in-house processes|
US11132462B2|2017-12-28|2021-09-28|Cilag Gmbh International|Data stripping method to interrogate patient records and create anonymized record|
US20190205001A1|2017-12-28|2019-07-04|Ethicon Llc|Sterile field interactive control displays|
US11056244B2|2017-12-28|2021-07-06|Cilag Gmbh International|Automated data scaling, alignment, and organizing based on predefined parameters within surgical networks|
US11160605B2|2017-12-28|2021-11-02|Cilag Gmbh International|Surgical evacuation sensing and motor control|
US11100631B2|2017-12-28|2021-08-24|Cilag Gmbh International|Use of laser light and red-green-blue coloration to determine properties of back scattered light|
US11179208B2|2017-12-28|2021-11-23|Cilag Gmbh International|Cloud-based medical analytics for security and authentication trends and reactive measures|
US11147607B2|2017-12-28|2021-10-19|Cilag Gmbh International|Bipolar combination device that automatically adjusts pressure based on energy modality|
US11013563B2|2017-12-28|2021-05-25|Ethicon Llc|Drive arrangements for robot-assisted surgical platforms|
US10892899B2|2017-12-28|2021-01-12|Ethicon Llc|Self describing data packets generated at an issuing instrument|
US10943454B2|2017-12-28|2021-03-09|Ethicon Llc|Detection and escalation of security responses of surgical instruments to increasing severity threats|
US10932872B2|2017-12-28|2021-03-02|Ethicon Llc|Cloud-based medical analytics for linking of local usage trends with the resource acquisition behaviors of larger data set|
US11259830B2|2018-03-08|2022-03-01|Cilag Gmbh International|Methods for controlling temperature in ultrasonic device|
US20190298350A1|2018-03-28|2019-10-03|Ethicon Llc|Methods for controlling a powered surgical stapler that has separate rotary closure and firing systems|
US11207067B2|2018-03-28|2021-12-28|Cilag Gmbh International|Surgical stapling device with separate rotary driven closure and firing systems and firing member that engages both jaws while firing|
US11090047B2|2018-03-28|2021-08-17|Cilag Gmbh International|Surgical instrument comprising an adaptive control system|
US10973520B2|2018-03-28|2021-04-13|Ethicon Llc|Surgical staple cartridge with firing member driven camming assembly that has an onboard tissue cutting feature|
US11213294B2|2018-03-28|2022-01-04|Cilag Gmbh International|Surgical instrument comprising co-operating lockout features|
US11096688B2|2018-03-28|2021-08-24|Cilag Gmbh International|Rotary driven firing members with different anvil and channel engagement features|
US11219453B2|2018-03-28|2022-01-11|Cilag Gmbh International|Surgical stapling devices with cartridge compatible closure and firing lockout arrangements|
US11197668B2|2018-03-28|2021-12-14|Cilag Gmbh International|Surgical stapling assembly comprising a lockout and an exterior access orifice to permit artificial unlocking of the lockout|
US11166716B2|2018-03-28|2021-11-09|Cilag Gmbh International|Stapling instrument comprising a deactivatable lockout|
US11259807B2|2019-02-19|2022-03-01|Cilag Gmbh International|Staple cartridges with cam surfaces configured to engage primary and secondary portions of a lockout of a surgical stapling device|
Legal status:
2021-12-07| B350| Update of information on the portal [chapter 15.35 patent gazette]|
Priority:
Application number | Application date | Patent title
US201762611340P| true| 2017-12-28|2017-12-28|
US201762611339P| true| 2017-12-28|2017-12-28|
US201762611341P| true| 2017-12-28|2017-12-28|
US62/611,341|2017-12-28|
US62/611,339|2017-12-28|
US62/611,340|2017-12-28|
US201862640417P| true| 2018-03-08|2018-03-08|
US201862640415P| true| 2018-03-08|2018-03-08|
US62/640,415|2018-03-08|
US62/640,417|2018-03-08|
US201862650877P| true| 2018-03-30|2018-03-30|
US201862650882P| true| 2018-03-30|2018-03-30|
US201862650887P| true| 2018-03-30|2018-03-30|
US201862650898P| true| 2018-03-30|2018-03-30|
US62/650,887|2018-03-30|
US62/650,877|2018-03-30|
US62/650,898|2018-03-30|
US62/650,882|2018-03-30|
US201862659900P| true| 2018-04-19|2018-04-19|
US62/659,900|2018-04-19|
US201862692768P| true| 2018-06-30|2018-06-30|
US201862692748P| true| 2018-06-30|2018-06-30|
US201862692747P| true| 2018-06-30|2018-06-30|
US62/692,747|2018-06-30|
US62/692,748|2018-06-30|
US62/692,768|2018-06-30|
US201862729191P| true| 2018-09-10|2018-09-10|
US62/729,191|2018-09-10|
US16/182,255|US20190201126A1|2017-12-28|2018-11-06|Usage and technique analysis of surgeon / staff performance against a baseline to optimize device utilization and performance for both current and future procedures|
US16/182,255|2018-11-06|
PCT/US2018/060958|WO2019133128A1|2017-12-28|2018-11-14|Usage and technique analysis of surgeon / staff performance against a baseline to optimize device utilization and performance for both current and future procedures|